Tue Mar 11 10:40:43 UTC 2025 I: starting to build sparql-wrapper-python/unstable/amd64 on jenkins on '2025-03-11 10:40'
Tue Mar 11 10:40:43 UTC 2025 I: The jenkins build log is/was available at https://jenkins.debian.net/userContent/reproducible/debian/build_service/amd64_27/49028/console.log
Tue Mar 11 10:40:44 UTC 2025 I: Downloading source for unstable/sparql-wrapper-python=2.0.0-2
--2025-03-11 10:40:44--  http://deb.debian.org/debian/pool/main/s/sparql-wrapper-python/sparql-wrapper-python_2.0.0-2.dsc
Connecting to 46.16.76.132:3128... connected.
Proxy request sent, awaiting response... 200 OK
Length: 2214 (2.2K) [text/prs.lines.tag]
Saving to: ‘sparql-wrapper-python_2.0.0-2.dsc’

     0K ..                                                    100%  265M=0s

2025-03-11 10:40:44 (265 MB/s) - ‘sparql-wrapper-python_2.0.0-2.dsc’ saved [2214/2214]

Tue Mar 11 10:40:44 UTC 2025 I: sparql-wrapper-python_2.0.0-2.dsc
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

Format: 3.0 (quilt)
Source: sparql-wrapper-python
Binary: python3-sparqlwrapper
Architecture: all
Version: 2.0.0-2
Maintainer: Debian Python Team <team+python@tracker.debian.org>
Uploaders: Christian M. Amsüss <chrysn@fsfe.org>,
Homepage: https://rdflib.github.io/sparqlwrapper/
Standards-Version: 4.6.0
Vcs-Browser: https://salsa.debian.org/python-team/packages/sparql-wrapper-python
Vcs-Git: https://salsa.debian.org/python-team/packages/sparql-wrapper-python.git
Testsuite: autopkgtest-pkg-python
Build-Depends: debhelper-compat (= 13), dh-sequence-python3, python3-all, python3-pytest, python3-setuptools, python3-rdflib
Package-List:
 python3-sparqlwrapper deb python optional arch=all
Checksums-Sha1:
 feae58b1ce9b0648039d5a9895818ae2525a3688 131812 sparql-wrapper-python_2.0.0.orig.tar.gz
 26567359f40b8c07673e57817ebca78a89bcc552 5692 sparql-wrapper-python_2.0.0-2.debian.tar.xz
Checksums-Sha256:
 9f2baa0f5cdfdc80ec8f5ad9e4379f15fcd2d0fd7e646c06d6cb7058daa12da1 131812 sparql-wrapper-python_2.0.0.orig.tar.gz
 dee5ee69d76fe6e7c88e27da9f3c3be25548ba375f49d521167e7a5cc3768eb0 5692 sparql-wrapper-python_2.0.0-2.debian.tar.xz
Files:
 baa200c8f3e8d65ee7c8ba9a7b016dbc 131812 sparql-wrapper-python_2.0.0.orig.tar.gz
 6cfaedc25e78869bd3ca332720acc133 5692 sparql-wrapper-python_2.0.0-2.debian.tar.xz

-----BEGIN PGP SIGNATURE-----

iQJFBAEBCgAvFiEEj23hBDd/OxHnQXSHMfMURUShdBoFAmZ7yfARHHRjaGV0QGRl
Ymlhbi5vcmcACgkQMfMURUShdBpYSxAArFloUjy79P4VJMYgZ9mCXFkmum3wxvfd
q1aLDS7o22WD1Z4sN30/JOV5eDLnIX7u0QaAZxK8Sm2Mzj+TveWFFsyCb2fQxSrX
ABHDKS7BUE8Lu2N7YR6ZD/PJwOlnZ9f6HS4ktIJ2H1N6ZX+KXyUrQ5VbV49N9UYW
W1xh1VuJwI8tZyIi1IBlaqos0O70i9vOUURQdXekIapRo9qYgjIsElavuS8YBPTa
Bf6Se9U7T3ra3+r8cnL2qSaM0Zf4iSMLpkUIZAvgz2i4hMNnVsR6pnqK5IA9VaAP
CQjcOcVAI9DMR1jTg/nMWLp3IG+6ACnHWIL3GkZUJZEH4SCD0pBSsJWPIk+jkQrV
Eo72Qer3IdpwotKEb2QG9DFIjrPq7LgzE/VyCWOGNQPkLdJzOLvpNgkTHjWwRvlP
+BTtCDsRYWQXTBLP4l2wH8FeGUpme31XLJZQT4hHV3OvZ44VkgXIgZZ+x6dJMmxe
QYIE6OKPxVgOfJYkv3XFNSfu3UhAEIX3NGFeW77SX0tYRVx4Jll/AJKYU42zLvvv
ph1iIVm0z/QuSpMt6rLsIJdiOhENNrwwwZ5F5knZXD8eP8w/9JVroWF1O4c5VvjI
pdTZU31clAV2z7YtzTe8MMGE+ylINefkCn1q/BEKLGta08b/c22pmayI3Og1n698
vSybjMqJuHs=
=Wspx
-----END PGP SIGNATURE-----

Tue Mar 11 10:40:44 UTC 2025 I: Checking whether the package is not for us
Tue Mar 11 10:40:44 UTC 2025 I: Starting 1st build on remote node ionos11-amd64.debian.net.
Tue Mar 11 10:40:44 UTC 2025 I: Preparing to do remote build '1' on ionos11-amd64.debian.net.
Tue Mar 11 11:20:48 UTC 2025 I: Deleting $TMPDIR on ionos11-amd64.debian.net.
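The Checksums-Sha256 stanza above is what dpkg-source verifies before unpacking the two tarballs. As a minimal illustration (a sketch, not dpkg-source's actual implementation), the same check can be reproduced with Python's hashlib, using the file names and digests quoted from the .dsc:

    import hashlib

    # Digests copied from the Checksums-Sha256 field of the .dsc above.
    EXPECTED_SHA256 = {
        "sparql-wrapper-python_2.0.0.orig.tar.gz":
            "9f2baa0f5cdfdc80ec8f5ad9e4379f15fcd2d0fd7e646c06d6cb7058daa12da1",
        "sparql-wrapper-python_2.0.0-2.debian.tar.xz":
            "dee5ee69d76fe6e7c88e27da9f3c3be25548ba375f49d521167e7a5cc3768eb0",
    }

    for name, expected in EXPECTED_SHA256.items():
        digest = hashlib.sha256()
        with open(name, "rb") as f:
            # Stream in chunks so the tarball need not fit in memory.
            for chunk in iter(lambda: f.read(1 << 16), b""):
                digest.update(chunk)
        print(name, "OK" if digest.hexdigest() == expected else "MISMATCH")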
I: pbuilder: network access will be disabled during build
I: Current time: Mon Mar 10 22:40:47 -12 2025
I: pbuilder-time-stamp: 1741689647
I: Building the build Environment
I: extracting base tarball [/var/cache/pbuilder/unstable-reproducible-base.tgz]
I: copying local configuration
W: --override-config is not set; not updating apt.conf Read the manpage for details.
I: mounting /proc filesystem
I: mounting /sys filesystem
I: creating /{dev,run}/shm
I: mounting /dev/pts filesystem
I: redirecting /dev/ptmx to /dev/pts/ptmx
I: policy-rc.d already exists
I: Copying source file
I: copying [sparql-wrapper-python_2.0.0-2.dsc]
I: copying [./sparql-wrapper-python_2.0.0.orig.tar.gz]
I: copying [./sparql-wrapper-python_2.0.0-2.debian.tar.xz]
I: Extracting source
dpkg-source: warning: cannot verify inline signature for ./sparql-wrapper-python_2.0.0-2.dsc: unsupported subcommand
dpkg-source: info: extracting sparql-wrapper-python in sparql-wrapper-python-2.0.0
dpkg-source: info: unpacking sparql-wrapper-python_2.0.0.orig.tar.gz
dpkg-source: info: unpacking sparql-wrapper-python_2.0.0-2.debian.tar.xz
I: Not using root during the build.
I: Installing the build-deps
I: user script /srv/workspace/pbuilder/1216197/tmp/hooks/D02_print_environment starting
I: set
  BUILDDIR='/build/reproducible-path'
  BUILDUSERGECOS='first user,first room,first work-phone,first home-phone,first other'
  BUILDUSERNAME='pbuilder1'
  BUILD_ARCH='amd64'
  DEBIAN_FRONTEND='noninteractive'
  DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=20 '
  DISTRIBUTION='unstable'
  HOME='/root'
  HOST_ARCH='amd64'
  IFS=' '
  INVOCATION_ID='204bc4751f8e4fbba5d155fa709e2149'
  LANG='C'
  LANGUAGE='en_US:en'
  LC_ALL='C'
  MAIL='/var/mail/root'
  OPTIND='1'
  PATH='/usr/sbin:/usr/bin:/sbin:/bin:/usr/games'
  PBCURRENTCOMMANDLINEOPERATION='build'
  PBUILDER_OPERATION='build'
  PBUILDER_PKGDATADIR='/usr/share/pbuilder'
  PBUILDER_PKGLIBDIR='/usr/lib/pbuilder'
  PBUILDER_SYSCONFDIR='/etc'
  PPID='1216197'
  PS1='# '
  PS2='> '
  PS4='+ '
  PWD='/'
  SHELL='/bin/bash'
  SHLVL='2'
  SUDO_COMMAND='/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.U3bS0lhH/pbuilderrc_ymzN --distribution unstable --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/unstable-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.U3bS0lhH/b1 --logfile b1/build.log sparql-wrapper-python_2.0.0-2.dsc'
  SUDO_GID='111'
  SUDO_UID='106'
  SUDO_USER='jenkins'
  TERM='unknown'
  TZ='/usr/share/zoneinfo/Etc/GMT+12'
  USER='root'
  _='/usr/bin/systemd-run'
  http_proxy='http://46.16.76.132:3128'
I: uname -a
  Linux ionos11-amd64 6.1.0-31-amd64 #1 SMP PREEMPT_DYNAMIC Debian 6.1.128-1 (2025-02-07) x86_64 GNU/Linux
I: ls -l /bin
  lrwxrwxrwx 1 root root 7 Mar  4 11:20 /bin -> usr/bin
I: user script /srv/workspace/pbuilder/1216197/tmp/hooks/D02_print_environment finished
 -> Attempting to satisfy build-dependencies
 -> Creating pbuilder-satisfydepends-dummy package
Package: pbuilder-satisfydepends-dummy
Version: 0.invalid.0
Architecture: amd64
Maintainer: Debian Pbuilder Team <pbuilder-maint@lists.alioth.debian.org>
Description: Dummy package to satisfy dependencies with aptitude - created by pbuilder
 This package was created automatically by pbuilder to satisfy the
 build-dependencies of the package being currently built.
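The environment dump from the D02_print_environment hook is worth pausing on: DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=20 ' is the space-separated option list that debhelper and friends consult, and TZ='/usr/share/zoneinfo/Etc/GMT+12' is one of the environment variations deliberately introduced between the two reproducibility builds. A minimal sketch (not debhelper's actual parser) of reading the parallel job count from that variable:

    import os

    def parallel_jobs(default: int = 1) -> int:
        """Extract N from a 'parallel=N' entry in DEB_BUILD_OPTIONS."""
        # DEB_BUILD_OPTIONS is a space-separated list; the trailing
        # space visible in the dump above is harmless to split().
        for opt in os.environ.get("DEB_BUILD_OPTIONS", "").split():
            if opt.startswith("parallel="):
                try:
                    return int(opt.split("=", 1)[1])
                except ValueError:
                    pass  # malformed entry; fall back to the default
        return default

    print(parallel_jobs())  # prints 20 under the environment shown above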
Depends: debhelper-compat (= 13), dh-sequence-python3, python3-all, python3-pytest, python3-setuptools, python3-rdflib
dpkg-deb: building package 'pbuilder-satisfydepends-dummy' in '/tmp/satisfydepends-aptitude/pbuilder-satisfydepends-dummy.deb'.
Selecting previously unselected package pbuilder-satisfydepends-dummy.
(Reading database ... 19783 files and directories currently installed.)
Preparing to unpack .../pbuilder-satisfydepends-dummy.deb ...
Unpacking pbuilder-satisfydepends-dummy (0.invalid.0) ...
dpkg: pbuilder-satisfydepends-dummy: dependency problems, but configuring anyway as you requested:
 pbuilder-satisfydepends-dummy depends on debhelper-compat (= 13); however:
  Package debhelper-compat is not installed.
 pbuilder-satisfydepends-dummy depends on dh-sequence-python3; however:
  Package dh-sequence-python3 is not installed.
 pbuilder-satisfydepends-dummy depends on python3-all; however:
  Package python3-all is not installed.
 pbuilder-satisfydepends-dummy depends on python3-pytest; however:
  Package python3-pytest is not installed.
 pbuilder-satisfydepends-dummy depends on python3-setuptools; however:
  Package python3-setuptools is not installed.
 pbuilder-satisfydepends-dummy depends on python3-rdflib; however:
  Package python3-rdflib is not installed.

Setting up pbuilder-satisfydepends-dummy (0.invalid.0) ...
Reading package lists...
Building dependency tree...
Reading state information...
Initializing package states...
Writing extended state information...
Building tag database...
pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0)
pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0)
The following NEW packages will be installed:
  autoconf{a} automake{a} autopoint{a} autotools-dev{a} bsdextrautils{a} debhelper{a} dh-autoreconf{a} dh-python{a} dh-strip-nondeterminism{a} dwz{a} file{a} gettext{a} gettext-base{a} groff-base{a} intltool-debian{a} libarchive-zip-perl{a} libdebhelper-perl{a} libelf1t64{a} libexpat1{a} libffi8{a} libfile-stripnondeterminism-perl{a} libicu72{a} libmagic-mgc{a} libmagic1t64{a} libpipeline1{a} libpython3-stdlib{a} libpython3.13-minimal{a} libpython3.13-stdlib{a} libreadline8t64{a} libtool{a} libuchardet0{a} libunistring5{a} libxml2{a} m4{a} man-db{a} media-types{a} netbase{a} po-debconf{a} python3{a} python3-all{a} python3-autocommand{a} python3-inflect{a} python3-iniconfig{a} python3-jaraco.context{a} python3-jaraco.functools{a} python3-jaraco.text{a} python3-minimal{a} python3-more-itertools{a} python3-packaging{a} python3-pkg-resources{a} python3-pluggy{a} python3-pyparsing{a} python3-pytest{a} python3-rdflib{a} python3-setuptools{a} python3-typeguard{a} python3-typing-extensions{a} python3-zipp{a} python3.13{a} python3.13-minimal{a} readline-common{a} sensible-utils{a} tzdata{a}
The following packages are RECOMMENDED but will NOT be installed:
  ca-certificates curl libarchive-cpio-perl libltdl-dev libmail-sendmail-perl lynx python3-html5rdf python3-lxml python3-networkx python3-orjson python3-pygments wget
0 packages upgraded, 63 newly installed, 0 to remove and 0 not upgraded.
Need to get 29.4 MB of archives. After unpacking 115 MB will be used.
Writing extended state information...
Get: 1 http://deb.debian.org/debian unstable/main amd64 libpython3.13-minimal amd64 3.13.2-1 [859 kB]
Get: 2 http://deb.debian.org/debian unstable/main amd64 libexpat1 amd64 2.6.4-1 [106 kB]
Get: 3 http://deb.debian.org/debian unstable/main amd64 python3.13-minimal amd64 3.13.2-1 [2205 kB]
Get: 4 http://deb.debian.org/debian unstable/main amd64 python3-minimal amd64 3.13.2-2 [27.1 kB]
Get: 5 http://deb.debian.org/debian unstable/main amd64 media-types all 13.0.0 [29.3 kB]
Get: 6 http://deb.debian.org/debian unstable/main amd64 netbase all 6.4 [12.8 kB]
Get: 7 http://deb.debian.org/debian unstable/main amd64 tzdata all 2025a-2 [259 kB]
Get: 8 http://deb.debian.org/debian unstable/main amd64 libffi8 amd64 3.4.7-1 [23.9 kB]
Get: 9 http://deb.debian.org/debian unstable/main amd64 readline-common all 8.2-6 [69.4 kB]
Get: 10 http://deb.debian.org/debian unstable/main amd64 libreadline8t64 amd64 8.2-6 [169 kB]
Get: 11 http://deb.debian.org/debian unstable/main amd64 libpython3.13-stdlib amd64 3.13.2-1 [1979 kB]
Get: 12 http://deb.debian.org/debian unstable/main amd64 python3.13 amd64 3.13.2-1 [745 kB]
Get: 13 http://deb.debian.org/debian unstable/main amd64 libpython3-stdlib amd64 3.13.2-2 [10.1 kB]
Get: 14 http://deb.debian.org/debian unstable/main amd64 python3 amd64 3.13.2-2 [28.1 kB]
Get: 15 http://deb.debian.org/debian unstable/main amd64 sensible-utils all 0.0.24 [24.8 kB]
Get: 16 http://deb.debian.org/debian unstable/main amd64 libmagic-mgc amd64 1:5.45-3+b1 [314 kB]
Get: 17 http://deb.debian.org/debian unstable/main amd64 libmagic1t64 amd64 1:5.45-3+b1 [108 kB]
Get: 18 http://deb.debian.org/debian unstable/main amd64 file amd64 1:5.45-3+b1 [43.3 kB]
Get: 19 http://deb.debian.org/debian unstable/main amd64 gettext-base amd64 0.23.1-1 [243 kB]
Get: 20 http://deb.debian.org/debian unstable/main amd64 libuchardet0 amd64 0.0.8-1+b2 [68.9 kB]
Get: 21 http://deb.debian.org/debian unstable/main amd64 groff-base amd64 1.23.0-7 [1185 kB]
Get: 22 http://deb.debian.org/debian unstable/main amd64 bsdextrautils amd64 2.40.4-5 [92.4 kB]
Get: 23 http://deb.debian.org/debian unstable/main amd64 libpipeline1 amd64 1.5.8-1 [42.0 kB]
Get: 24 http://deb.debian.org/debian unstable/main amd64 man-db amd64 2.13.0-1 [1420 kB]
Get: 25 http://deb.debian.org/debian unstable/main amd64 m4 amd64 1.4.19-7 [294 kB]
Get: 26 http://deb.debian.org/debian unstable/main amd64 autoconf all 2.72-3 [493 kB]
Get: 27 http://deb.debian.org/debian unstable/main amd64 autotools-dev all 20220109.1 [51.6 kB]
Get: 28 http://deb.debian.org/debian unstable/main amd64 automake all 1:1.17-3 [862 kB]
Get: 29 http://deb.debian.org/debian unstable/main amd64 autopoint all 0.23.1-1 [770 kB]
Get: 30 http://deb.debian.org/debian unstable/main amd64 libdebhelper-perl all 13.24.1 [90.9 kB]
Get: 31 http://deb.debian.org/debian unstable/main amd64 libtool all 2.5.4-4 [539 kB]
Get: 32 http://deb.debian.org/debian unstable/main amd64 dh-autoreconf all 20 [17.1 kB]
Get: 33 http://deb.debian.org/debian unstable/main amd64 libarchive-zip-perl all 1.68-1 [104 kB]
Get: 34 http://deb.debian.org/debian unstable/main amd64 libfile-stripnondeterminism-perl all 1.14.1-2 [19.7 kB]
Get: 35 http://deb.debian.org/debian unstable/main amd64 dh-strip-nondeterminism all 1.14.1-2 [8620 B]
Get: 36 http://deb.debian.org/debian unstable/main amd64 libelf1t64 amd64 0.192-4 [189 kB]
Get: 37 http://deb.debian.org/debian unstable/main amd64 dwz amd64 0.15-1+b1 [110 kB]
Get: 38 http://deb.debian.org/debian unstable/main amd64 libunistring5 amd64 1.3-1 [476 kB]
Get: 39 http://deb.debian.org/debian unstable/main amd64 libicu72 amd64 72.1-6 [9421 kB]
Get: 40 http://deb.debian.org/debian unstable/main amd64 libxml2 amd64 2.12.7+dfsg+really2.9.14-0.2+b2 [699 kB]
Get: 41 http://deb.debian.org/debian unstable/main amd64 gettext amd64 0.23.1-1 [1680 kB]
Get: 42 http://deb.debian.org/debian unstable/main amd64 intltool-debian all 0.35.0+20060710.6 [22.9 kB]
Get: 43 http://deb.debian.org/debian unstable/main amd64 po-debconf all 1.0.21+nmu1 [248 kB]
Get: 44 http://deb.debian.org/debian unstable/main amd64 debhelper all 13.24.1 [920 kB]
Get: 45 http://deb.debian.org/debian unstable/main amd64 dh-python all 6.20250308 [115 kB]
Get: 46 http://deb.debian.org/debian unstable/main amd64 python3-all amd64 3.13.2-2 [1044 B]
Get: 47 http://deb.debian.org/debian unstable/main amd64 python3-autocommand all 2.2.2-3 [13.6 kB]
Get: 48 http://deb.debian.org/debian unstable/main amd64 python3-more-itertools all 10.6.0-1 [65.3 kB]
Get: 49 http://deb.debian.org/debian unstable/main amd64 python3-typing-extensions all 4.12.2-2 [73.0 kB]
Get: 50 http://deb.debian.org/debian unstable/main amd64 python3-typeguard all 4.4.2-1 [37.3 kB]
Get: 51 http://deb.debian.org/debian unstable/main amd64 python3-inflect all 7.3.1-2 [32.4 kB]
Get: 52 http://deb.debian.org/debian unstable/main amd64 python3-iniconfig all 1.1.1-2 [6396 B]
Get: 53 http://deb.debian.org/debian unstable/main amd64 python3-jaraco.functools all 4.1.0-1 [12.0 kB]
Get: 54 http://deb.debian.org/debian unstable/main amd64 python3-pkg-resources all 75.8.0-1 [222 kB]
Get: 55 http://deb.debian.org/debian unstable/main amd64 python3-jaraco.text all 4.0.0-1 [11.4 kB]
Get: 56 http://deb.debian.org/debian unstable/main amd64 python3-zipp all 3.21.0-1 [10.6 kB]
Get: 57 http://deb.debian.org/debian unstable/main amd64 python3-setuptools all 75.8.0-1 [724 kB]
Get: 58 http://deb.debian.org/debian unstable/main amd64 python3-jaraco.context all 6.0.1-1 [8276 B]
Get: 59 http://deb.debian.org/debian unstable/main amd64 python3-packaging all 24.2-1 [55.3 kB]
Get: 60 http://deb.debian.org/debian unstable/main amd64 python3-pluggy all 1.5.0-1 [26.9 kB]
Get: 61 http://deb.debian.org/debian unstable/main amd64 python3-pyparsing all 3.1.2-1 [146 kB]
Get: 62 http://deb.debian.org/debian unstable/main amd64 python3-pytest all 8.3.5-1 [250 kB]
Get: 63 http://deb.debian.org/debian unstable/main amd64 python3-rdflib all 7.1.1-2 [472 kB]
Fetched 29.4 MB in 4s (8111 kB/s)
Preconfiguring packages ...
Selecting previously unselected package libpython3.13-minimal:amd64.
(Reading database ... 19783 files and directories currently installed.)
Preparing to unpack .../libpython3.13-minimal_3.13.2-1_amd64.deb ...
Unpacking libpython3.13-minimal:amd64 (3.13.2-1) ...
Selecting previously unselected package libexpat1:amd64.
Preparing to unpack .../libexpat1_2.6.4-1_amd64.deb ...
Unpacking libexpat1:amd64 (2.6.4-1) ...
Selecting previously unselected package python3.13-minimal.
Preparing to unpack .../python3.13-minimal_3.13.2-1_amd64.deb ...
Unpacking python3.13-minimal (3.13.2-1) ...
Setting up libpython3.13-minimal:amd64 (3.13.2-1) ...
Setting up libexpat1:amd64 (2.6.4-1) ...
Setting up python3.13-minimal (3.13.2-1) ...
Selecting previously unselected package python3-minimal.
(Reading database ... 20117 files and directories currently installed.)
Preparing to unpack .../0-python3-minimal_3.13.2-2_amd64.deb ...
Unpacking python3-minimal (3.13.2-2) ...
Selecting previously unselected package media-types.
Preparing to unpack .../1-media-types_13.0.0_all.deb ...
Unpacking media-types (13.0.0) ...
Selecting previously unselected package netbase.
Preparing to unpack .../2-netbase_6.4_all.deb ...
Unpacking netbase (6.4) ...
Selecting previously unselected package tzdata.
Preparing to unpack .../3-tzdata_2025a-2_all.deb ...
Unpacking tzdata (2025a-2) ...
Selecting previously unselected package libffi8:amd64.
Preparing to unpack .../4-libffi8_3.4.7-1_amd64.deb ...
Unpacking libffi8:amd64 (3.4.7-1) ...
Selecting previously unselected package readline-common.
Preparing to unpack .../5-readline-common_8.2-6_all.deb ...
Unpacking readline-common (8.2-6) ...
Selecting previously unselected package libreadline8t64:amd64.
Preparing to unpack .../6-libreadline8t64_8.2-6_amd64.deb ...
Adding 'diversion of /lib/x86_64-linux-gnu/libhistory.so.8 to /lib/x86_64-linux-gnu/libhistory.so.8.usr-is-merged by libreadline8t64'
Adding 'diversion of /lib/x86_64-linux-gnu/libhistory.so.8.2 to /lib/x86_64-linux-gnu/libhistory.so.8.2.usr-is-merged by libreadline8t64'
Adding 'diversion of /lib/x86_64-linux-gnu/libreadline.so.8 to /lib/x86_64-linux-gnu/libreadline.so.8.usr-is-merged by libreadline8t64'
Adding 'diversion of /lib/x86_64-linux-gnu/libreadline.so.8.2 to /lib/x86_64-linux-gnu/libreadline.so.8.2.usr-is-merged by libreadline8t64'
Unpacking libreadline8t64:amd64 (8.2-6) ...
Selecting previously unselected package libpython3.13-stdlib:amd64.
Preparing to unpack .../7-libpython3.13-stdlib_3.13.2-1_amd64.deb ...
Unpacking libpython3.13-stdlib:amd64 (3.13.2-1) ...
Selecting previously unselected package python3.13.
Preparing to unpack .../8-python3.13_3.13.2-1_amd64.deb ...
Unpacking python3.13 (3.13.2-1) ...
Selecting previously unselected package libpython3-stdlib:amd64.
Preparing to unpack .../9-libpython3-stdlib_3.13.2-2_amd64.deb ...
Unpacking libpython3-stdlib:amd64 (3.13.2-2) ...
Setting up python3-minimal (3.13.2-2) ...
Selecting previously unselected package python3.
(Reading database ... 21127 files and directories currently installed.)
Preparing to unpack .../00-python3_3.13.2-2_amd64.deb ...
Unpacking python3 (3.13.2-2) ...
Selecting previously unselected package sensible-utils.
Preparing to unpack .../01-sensible-utils_0.0.24_all.deb ...
Unpacking sensible-utils (0.0.24) ...
Selecting previously unselected package libmagic-mgc.
Preparing to unpack .../02-libmagic-mgc_1%3a5.45-3+b1_amd64.deb ...
Unpacking libmagic-mgc (1:5.45-3+b1) ...
Selecting previously unselected package libmagic1t64:amd64.
Preparing to unpack .../03-libmagic1t64_1%3a5.45-3+b1_amd64.deb ...
Unpacking libmagic1t64:amd64 (1:5.45-3+b1) ...
Selecting previously unselected package file.
Preparing to unpack .../04-file_1%3a5.45-3+b1_amd64.deb ...
Unpacking file (1:5.45-3+b1) ...
Selecting previously unselected package gettext-base.
Preparing to unpack .../05-gettext-base_0.23.1-1_amd64.deb ...
Unpacking gettext-base (0.23.1-1) ...
Selecting previously unselected package libuchardet0:amd64.
Preparing to unpack .../06-libuchardet0_0.0.8-1+b2_amd64.deb ...
Unpacking libuchardet0:amd64 (0.0.8-1+b2) ...
Selecting previously unselected package groff-base.
Preparing to unpack .../07-groff-base_1.23.0-7_amd64.deb ...
Unpacking groff-base (1.23.0-7) ...
Selecting previously unselected package bsdextrautils.
Preparing to unpack .../08-bsdextrautils_2.40.4-5_amd64.deb ...
Unpacking bsdextrautils (2.40.4-5) ...
Selecting previously unselected package libpipeline1:amd64.
Preparing to unpack .../09-libpipeline1_1.5.8-1_amd64.deb ...
Unpacking libpipeline1:amd64 (1.5.8-1) ...
Selecting previously unselected package man-db.
Preparing to unpack .../10-man-db_2.13.0-1_amd64.deb ...
Unpacking man-db (2.13.0-1) ...
Selecting previously unselected package m4.
Preparing to unpack .../11-m4_1.4.19-7_amd64.deb ...
Unpacking m4 (1.4.19-7) ...
Selecting previously unselected package autoconf.
Preparing to unpack .../12-autoconf_2.72-3_all.deb ...
Unpacking autoconf (2.72-3) ...
Selecting previously unselected package autotools-dev.
Preparing to unpack .../13-autotools-dev_20220109.1_all.deb ...
Unpacking autotools-dev (20220109.1) ...
Selecting previously unselected package automake.
Preparing to unpack .../14-automake_1%3a1.17-3_all.deb ...
Unpacking automake (1:1.17-3) ...
Selecting previously unselected package autopoint.
Preparing to unpack .../15-autopoint_0.23.1-1_all.deb ...
Unpacking autopoint (0.23.1-1) ...
Selecting previously unselected package libdebhelper-perl.
Preparing to unpack .../16-libdebhelper-perl_13.24.1_all.deb ...
Unpacking libdebhelper-perl (13.24.1) ...
Selecting previously unselected package libtool.
Preparing to unpack .../17-libtool_2.5.4-4_all.deb ...
Unpacking libtool (2.5.4-4) ...
Selecting previously unselected package dh-autoreconf.
Preparing to unpack .../18-dh-autoreconf_20_all.deb ...
Unpacking dh-autoreconf (20) ...
Selecting previously unselected package libarchive-zip-perl.
Preparing to unpack .../19-libarchive-zip-perl_1.68-1_all.deb ...
Unpacking libarchive-zip-perl (1.68-1) ...
Selecting previously unselected package libfile-stripnondeterminism-perl.
Preparing to unpack .../20-libfile-stripnondeterminism-perl_1.14.1-2_all.deb ...
Unpacking libfile-stripnondeterminism-perl (1.14.1-2) ...
Selecting previously unselected package dh-strip-nondeterminism.
Preparing to unpack .../21-dh-strip-nondeterminism_1.14.1-2_all.deb ...
Unpacking dh-strip-nondeterminism (1.14.1-2) ...
Selecting previously unselected package libelf1t64:amd64.
Preparing to unpack .../22-libelf1t64_0.192-4_amd64.deb ...
Unpacking libelf1t64:amd64 (0.192-4) ...
Selecting previously unselected package dwz.
Preparing to unpack .../23-dwz_0.15-1+b1_amd64.deb ...
Unpacking dwz (0.15-1+b1) ...
Selecting previously unselected package libunistring5:amd64.
Preparing to unpack .../24-libunistring5_1.3-1_amd64.deb ...
Unpacking libunistring5:amd64 (1.3-1) ...
Selecting previously unselected package libicu72:amd64.
Preparing to unpack .../25-libicu72_72.1-6_amd64.deb ...
Unpacking libicu72:amd64 (72.1-6) ...
Selecting previously unselected package libxml2:amd64.
Preparing to unpack .../26-libxml2_2.12.7+dfsg+really2.9.14-0.2+b2_amd64.deb ...
Unpacking libxml2:amd64 (2.12.7+dfsg+really2.9.14-0.2+b2) ...
Selecting previously unselected package gettext.
Preparing to unpack .../27-gettext_0.23.1-1_amd64.deb ...
Unpacking gettext (0.23.1-1) ...
Selecting previously unselected package intltool-debian.
Preparing to unpack .../28-intltool-debian_0.35.0+20060710.6_all.deb ...
Unpacking intltool-debian (0.35.0+20060710.6) ...
Selecting previously unselected package po-debconf.
Preparing to unpack .../29-po-debconf_1.0.21+nmu1_all.deb ...
Unpacking po-debconf (1.0.21+nmu1) ...
Selecting previously unselected package debhelper.
Preparing to unpack .../30-debhelper_13.24.1_all.deb ...
Unpacking debhelper (13.24.1) ...
Selecting previously unselected package dh-python.
Preparing to unpack .../31-dh-python_6.20250308_all.deb ...
Unpacking dh-python (6.20250308) ...
Selecting previously unselected package python3-all.
Preparing to unpack .../32-python3-all_3.13.2-2_amd64.deb ...
Unpacking python3-all (3.13.2-2) ...
Selecting previously unselected package python3-autocommand.
Preparing to unpack .../33-python3-autocommand_2.2.2-3_all.deb ...
Unpacking python3-autocommand (2.2.2-3) ...
Selecting previously unselected package python3-more-itertools.
Preparing to unpack .../34-python3-more-itertools_10.6.0-1_all.deb ...
Unpacking python3-more-itertools (10.6.0-1) ...
Selecting previously unselected package python3-typing-extensions.
Preparing to unpack .../35-python3-typing-extensions_4.12.2-2_all.deb ...
Unpacking python3-typing-extensions (4.12.2-2) ...
Selecting previously unselected package python3-typeguard.
Preparing to unpack .../36-python3-typeguard_4.4.2-1_all.deb ...
Unpacking python3-typeguard (4.4.2-1) ...
Selecting previously unselected package python3-inflect.
Preparing to unpack .../37-python3-inflect_7.3.1-2_all.deb ...
Unpacking python3-inflect (7.3.1-2) ...
Selecting previously unselected package python3-iniconfig.
Preparing to unpack .../38-python3-iniconfig_1.1.1-2_all.deb ...
Unpacking python3-iniconfig (1.1.1-2) ...
Selecting previously unselected package python3-jaraco.functools.
Preparing to unpack .../39-python3-jaraco.functools_4.1.0-1_all.deb ...
Unpacking python3-jaraco.functools (4.1.0-1) ...
Selecting previously unselected package python3-pkg-resources.
Preparing to unpack .../40-python3-pkg-resources_75.8.0-1_all.deb ...
Unpacking python3-pkg-resources (75.8.0-1) ...
Selecting previously unselected package python3-jaraco.text.
Preparing to unpack .../41-python3-jaraco.text_4.0.0-1_all.deb ...
Unpacking python3-jaraco.text (4.0.0-1) ...
Selecting previously unselected package python3-zipp.
Preparing to unpack .../42-python3-zipp_3.21.0-1_all.deb ...
Unpacking python3-zipp (3.21.0-1) ...
Selecting previously unselected package python3-setuptools.
Preparing to unpack .../43-python3-setuptools_75.8.0-1_all.deb ...
Unpacking python3-setuptools (75.8.0-1) ...
Selecting previously unselected package python3-jaraco.context.
Preparing to unpack .../44-python3-jaraco.context_6.0.1-1_all.deb ...
Unpacking python3-jaraco.context (6.0.1-1) ...
Selecting previously unselected package python3-packaging.
Preparing to unpack .../45-python3-packaging_24.2-1_all.deb ...
Unpacking python3-packaging (24.2-1) ...
Selecting previously unselected package python3-pluggy.
Preparing to unpack .../46-python3-pluggy_1.5.0-1_all.deb ...
Unpacking python3-pluggy (1.5.0-1) ...
Selecting previously unselected package python3-pyparsing.
Preparing to unpack .../47-python3-pyparsing_3.1.2-1_all.deb ...
Unpacking python3-pyparsing (3.1.2-1) ...
Selecting previously unselected package python3-pytest.
Preparing to unpack .../48-python3-pytest_8.3.5-1_all.deb ...
Unpacking python3-pytest (8.3.5-1) ...
Selecting previously unselected package python3-rdflib.
Preparing to unpack .../49-python3-rdflib_7.1.1-2_all.deb ...
Unpacking python3-rdflib (7.1.1-2) ...
Setting up media-types (13.0.0) ...
Setting up libpipeline1:amd64 (1.5.8-1) ...
Setting up libicu72:amd64 (72.1-6) ...
Setting up bsdextrautils (2.40.4-5) ...
Setting up libmagic-mgc (1:5.45-3+b1) ...
Setting up libarchive-zip-perl (1.68-1) ...
Setting up libdebhelper-perl (13.24.1) ...
Setting up libmagic1t64:amd64 (1:5.45-3+b1) ...
Setting up gettext-base (0.23.1-1) ...
Setting up m4 (1.4.19-7) ...
Setting up file (1:5.45-3+b1) ...
Setting up libelf1t64:amd64 (0.192-4) ...
Setting up tzdata (2025a-2) ...

Current default time zone: 'Etc/UTC'
Local time is now:      Tue Mar 11 10:42:39 UTC 2025.
Universal Time is now:  Tue Mar 11 10:42:39 UTC 2025.
Run 'dpkg-reconfigure tzdata' if you wish to change it.

Setting up autotools-dev (20220109.1) ...
Setting up libunistring5:amd64 (1.3-1) ...
Setting up autopoint (0.23.1-1) ...
Setting up autoconf (2.72-3) ...
Setting up libffi8:amd64 (3.4.7-1) ...
Setting up dwz (0.15-1+b1) ...
Setting up sensible-utils (0.0.24) ...
Setting up libuchardet0:amd64 (0.0.8-1+b2) ...
Setting up netbase (6.4) ...
Setting up readline-common (8.2-6) ...
Setting up libxml2:amd64 (2.12.7+dfsg+really2.9.14-0.2+b2) ...
Setting up automake (1:1.17-3) ...
update-alternatives: using /usr/bin/automake-1.17 to provide /usr/bin/automake (automake) in auto mode
Setting up libfile-stripnondeterminism-perl (1.14.1-2) ...
Setting up gettext (0.23.1-1) ...
Setting up libtool (2.5.4-4) ...
Setting up intltool-debian (0.35.0+20060710.6) ...
Setting up dh-autoreconf (20) ...
Setting up libreadline8t64:amd64 (8.2-6) ...
Setting up dh-strip-nondeterminism (1.14.1-2) ...
Setting up groff-base (1.23.0-7) ...
Setting up libpython3.13-stdlib:amd64 (3.13.2-1) ...
Setting up libpython3-stdlib:amd64 (3.13.2-2) ...
Setting up python3.13 (3.13.2-1) ...
Setting up po-debconf (1.0.21+nmu1) ...
Setting up python3 (3.13.2-2) ...
Setting up python3-zipp (3.21.0-1) ...
Setting up python3-autocommand (2.2.2-3) ...
Setting up man-db (2.13.0-1) ...
Not building database; man-db/auto-update is not 'true'.
Setting up python3-packaging (24.2-1) ...
Setting up python3-pyparsing (3.1.2-1) ...
Setting up python3-typing-extensions (4.12.2-2) ...
Setting up python3-pluggy (1.5.0-1) ...
Setting up python3-rdflib (7.1.1-2) ...
Setting up dh-python (6.20250308) ...
Setting up python3-more-itertools (10.6.0-1) ...
Setting up python3-iniconfig (1.1.1-2) ...
Setting up python3-jaraco.functools (4.1.0-1) ...
Setting up python3-jaraco.context (6.0.1-1) ...
Setting up python3-pytest (8.3.5-1) ...
Setting up python3-typeguard (4.4.2-1) ...
Setting up python3-all (3.13.2-2) ...
Setting up debhelper (13.24.1) ...
Setting up python3-inflect (7.3.1-2) ...
Setting up python3-jaraco.text (4.0.0-1) ...
Setting up python3-pkg-resources (75.8.0-1) ...
Setting up python3-setuptools (75.8.0-1) ...
Processing triggers for libc-bin (2.41-4) ...
Reading package lists...
Building dependency tree...
Reading state information...
Reading extended state information...
Initializing package states...
Writing extended state information...
Building tag database...
 -> Finished parsing the build-deps
I: Building the package
I: Running cd /build/reproducible-path/sparql-wrapper-python-2.0.0/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-genchanges -S > ../sparql-wrapper-python_2.0.0-2_source.changes
dpkg-buildpackage: info: source package sparql-wrapper-python
dpkg-buildpackage: info: source version 2.0.0-2
dpkg-buildpackage: info: source distribution unstable
dpkg-buildpackage: info: source changed by Alexandre Detiste <tchet@debian.org>
 dpkg-source --before-build .
dpkg-buildpackage: info: host architecture amd64
 debian/rules clean
dh clean --buildsystem=pybuild
   dh_auto_clean -O--buildsystem=pybuild
I: pybuild base:311: python3.13 setup.py clean
running clean
removing '/build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build' (and everything under it)
'build/bdist.linux-x86_64' does not exist -- can't clean it
'build/scripts-3.13' does not exist -- can't clean it
   dh_autoreconf_clean -O--buildsystem=pybuild
   dh_clean -O--buildsystem=pybuild
 debian/rules binary
dh binary --buildsystem=pybuild
   dh_update_autotools_config -O--buildsystem=pybuild
   dh_autoreconf -O--buildsystem=pybuild
   dh_auto_configure -O--buildsystem=pybuild
I: pybuild base:311: python3.13 setup.py config
running config
   dh_auto_build -O--buildsystem=pybuild
I: pybuild base:311: /usr/bin/python3 setup.py build
running build
running build_py
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/__init__.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/Wrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/main.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/SmartWrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/sparql_dataframe.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/SPARQLExceptions.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/py.typed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
I: pybuild pybuild:334: cp -r test /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build
 debian/rules override_dh_auto_test
make[1]: Entering directory '/build/reproducible-path/sparql-wrapper-python-2.0.0'
# tests need a remote server
dh_auto_test || :
I: pybuild base:311: cd /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build; python3.13 -m pytest test
============================= test session starts ==============================
platform linux -- Python 3.13.2, pytest-8.3.5, pluggy-1.5.0
rootdir: /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build
configfile: pyproject.toml
plugins: typeguard-4.4.2
collected 1525 items

test/test_agrovoc-allegrograph_on_hold.py sFxxsFFsFFxsFFxxsFFFFxxsFFFFxx [  1%]
sFFFFxxsFFFFFFFFssFFFxxFFxFFxxFFF                                        [  4%]
test/test_allegrograph__v4_14_1__mmi.py ssFFFFFFssFFFFssFFFFFFssFFFFFFss [  6%]
FFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFFFssFFFFFF [ 10%]
FFFFFFFFFFFFFFFFFFFFFFF                                                  [ 12%]
test/test_blazegraph__wikidata.py ssFFFFFFssFFFFssFFFFFFssFFFFFFsFsFsFFF [ 14%]
sFFFFFFFsFsFsFFFsFFFFFFFsFsFsFFFsFFFFFFFsFsFsFFFsFFFFFFFsFFsFsFFFFFFFsFF [ 19%]
FFFsFFFFFFFsFFFFF                                                        [ 20%]
test/test_cli.py ..F...FFFFFFFFFFFFFFFFFFFFFF                            [ 22%]
test/test_fuseki2__v3_6_0__agrovoc.py FFFsFFsFFFFFFFFFFsFFsFFFFFFFsFFFFF [ 24%]
sFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFFFFFFsFFFFsFFs [ 29%]
FFFFFFFFFFsFFsFFFFFFF                                                    [ 30%]
test/test_fuseki2__v3_8_0__stw.py FFFsFFsFFFFFFFFFFsFFsFFFFFFFsFFFFFsFsF [ 33%]
sFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFFFFFFsFFFFsFFsFFFF [ 38%]
FFFFFFsFFsFFFFFFF                                                        [ 39%]
test/test_graphdbEnterprise__v8_9_0__rs.py ssssFFsFsssFsFssssFFsFsssFsFs [ 41%]
FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFFsFFsFsF [ 45%]
ssFFsFsFsFsFsFssFFsFsFsFsF                                               [ 47%]
test/test_lov-fuseki_on_hold.py FFFFFFFFFFFFFFssssssssssssssFFFFFFFFFFFF [ 50%]
FFFFssssssssssssssssFFFFFFFFFFFFFFFFssssssssssssssssFsFFssFFFFFFFFFFFFFF [ 54%]
Fssssssssssssss                                                          [ 55%]
test/test_rdf4j__geosciml.py ssssFFsFsssFsFssssFFsFsssFsFsFsFsFsFsFsFsFs [ 58%]
FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFFssFFsFsFssFFsFsFsFsFs [ 63%]
FssFFsFsFsFsF                                                            [ 64%]
test/test_stardog__lindas.py ssssFFsFsssFsFssssFFsFsssFsFsFsFsFsFsFsFsFs [ 67%]
FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFssFFFsFsFssFFsFsFsFsFs [ 71%]
FssFFsFsFsFsF                                                            [ 72%]
test/test_store__v1_1_4.py FFFsFFsFsFxFxFxxxxxxxxxxxxxxsFsssFsFsFsFxFxFx [ 75%]
xssxxxxxxxxxxxxsFsssFsssFssxFxFxxssxxxxxxxxxxxxFFFFssFFFFsFFsFsFxFxFxxxx [ 80%]
xxxxxxxxxx                                                               [ 81%]
test/test_virtuoso__v7_20_3230__dbpedia.py FFFssFssFFFFFFsssssFsssssssss [ 82%]
FFFssFFFFFFFFFFsFssssFssssssFsssFFFssFFFFFFFFFFssssssssssssssssFFFFssFFF [ 87%]
FFFFssFFFFFFsssFFsssssssss                                               [ 89%]
test/test_virtuoso__v8_03_3313__dbpedia.py FFFssFssFFFFFFsssssssssssssss [ 91%]
FFFssFFFFFFFFFFsssssssssssssssssFFFssFFFFFFFFFFssssssssssssssssFFFFFsFFF [ 96%]
FFFFssFFFFFFssssssssssssss                                               [ 97%]
test/test_wrapper.py ....s..........................F...                 [100%]

=================================== FAILURES ===================================
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02755550>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02756510>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting
        returned by :func:`getdefaulttimeout` is used.  If *source_address*
        is set it must be a tuple of (host, port) for the socket to bind as
        a source address before making the connection.  A host of '' or
        port 0 tells the OS to use the default.  When a connection cannot
        be created, raises the last error if *all_errors* is False, and an
        ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testAskByGETinJSON>

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_agrovoc-allegrograph_on_hold.py:403:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02755550>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02756510>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f023c3360> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b08050> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testAskByGETinUnknow> def testAskByGETinUnknow(self): > result = self.__generic(askQuery, "foo", GET) test/test_agrovoc-allegrograph_on_hold.py:459: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f023c3360> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b08050> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02b088a0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b089d0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testAskByGETinXML>

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:345:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02b088a0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b089d0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f02bb2330> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02bb1a30> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
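All of the failures in this section share one root cause: the test suite tries to reach the AGROVOC SPARQL endpoint over HTTPS, but the connection target is 127.0.0.1:9, where nothing listens, since pbuilder disables network access during the build. The TCP connect is refused before any TLS or HTTP traffic happens, http.client propagates the ConnectionRefusedError as an OSError, and urllib wraps it in URLError. A minimal sketch of that error chain, assuming only the Python 3.13 standard library (the URL below is a stand-in for the endpoint the tests use):

    # Minimal sketch: reproduce the URLError chain seen in the tracebacks above.
    # Port 9 on localhost has no listener, so connect() is refused immediately.
    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        # err.reason is the underlying ConnectionRefusedError ([Errno 111])
        print("URLError:", err.reason)

The requested result format only changes the Accept header of each request; it never influences the outcome here, which is why XML, JSON, JSON-LD and the deliberately unknown formats below all fail identically.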
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

req = <urllib.request.Request object at 0x7f9f02bb2330>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02bb1a30>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_agrovoc-allegrograph_on_hold.py:410:
    [... do_open()/create_connection() traceback identical to testAskByGETinXML above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinJSONLD ___________________

req = <urllib.request.Request object at 0x7f9f02b4a030>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b49590>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSONLD(self):
>       result = self.__generic(askQuery, JSONLD, POST)

test/test_agrovoc-allegrograph_on_hold.py:451:
    [... do_open()/create_connection() traceback identical to testAskByGETinXML above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

req = <urllib.request.Request object at 0x7f9f024eed50>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f024ef350>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_agrovoc-allegrograph_on_hold.py:469:
    [... do_open()/create_connection() traceback identical to testAskByGETinXML above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

req = <urllib.request.Request object at 0x7f9f024ee350>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f024ecc50>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)

test/test_agrovoc-allegrograph_on_hold.py:354:
    [... do_open()/create_connection() traceback identical to testAskByGETinXML above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
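The socket.create_connection() frames repeated in each traceback also explain exactly which exception surfaces: the function loops over every address returned by getaddrinfo(), collects the OSError from each failed attempt, and re-raises the first collected error when nothing connects (the `raise exceptions[0]` frame at socket.py:864). A simplified sketch of that loop, not the stdlib code itself:

    # Simplified sketch of the create_connection() retry loop shown above:
    # try each resolved address, collect the errors, re-raise one on failure.
    import socket

    def connect_first(host: str, port: int, timeout: float = 5.0) -> socket.socket:
        errors: list[OSError] = []
        for af, socktype, proto, _canonname, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
                sock.settimeout(timeout)
                sock.connect(sa)
                return sock
            except OSError as exc:
                errors.append(exc)
                if sock is not None:
                    sock.close()
        raise errors[0]  # here: ConnectionRefusedError [Errno 111]

For 127.0.0.1 there is only a single resolved address, so the one ConnectionRefusedError is re-raised unchanged and then wrapped in URLError by do_open().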
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

req = <urllib.request.Request object at 0x7f9f0232b310>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0232b3f0>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_agrovoc-allegrograph_on_hold.py:513:
    [... do_open()/create_connection() traceback identical to testAskByGETinXML above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

req = <urllib.request.Request object at 0x7f9f02b5c2f0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5d630>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_agrovoc-allegrograph_on_hold.py:499:
    [... do_open()/create_connection() traceback identical to testAskByGETinXML above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

req = <urllib.request.Request object at 0x7f9f01ea13d0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e430>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_agrovoc-allegrograph_on_hold.py:593:
    [... do_open()/create_connection() traceback identical to testAskByGETinXML above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinXML> def testConstructByGETinXML(self): > result = self.__generic(constructQuery, XML, GET) test/test_agrovoc-allegrograph_on_hold.py:485: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f02b59e50> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5fcb0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f02349e90> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023dd9b0> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
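Every failure in this run has the same root cause: pbuilder disables network access and http_proxy points at 127.0.0.1:9, where nothing listens, so every TCP connect is refused before a SPARQL request ever leaves the machine. A minimal sketch of the underlying error; the address comes from the log, and no listener on port 9 is assumed:

    import socket

    # Attempt the same TCP connection the test suite makes via its proxy.
    # Nothing listens on port 9 here, so the kernel refuses the connection.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2)
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused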
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02349e90>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023dd9b0>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testConstructByPOSTinN3>

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)

test/test_agrovoc-allegrograph_on_hold.py:520:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
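For orientation, the failing tests drive SPARQLWrapper roughly as sketched below. The endpoint URL and query text are assumptions reconstructed from the Host header and test names in the log, not copied from the test suite:

    from SPARQLWrapper import SPARQLWrapper, N3, POST

    # Hypothetical reconstruction of what __generic(constructQuery, N3, POST)
    # exercises; endpoint URL and query body are assumed.
    sparql = SPARQLWrapper("https://agrovoc.fao.org/sparql")
    sparql.setQuery("""
        CONSTRUCT { ?s ?p ?o }
        WHERE { ?s ?p ?o }
        LIMIT 10
    """)
    sparql.setMethod(POST)
    sparql.setReturnFormat(N3)
    result = sparql.query()   # urlopener(request) runs here (Wrapper.py:926)
    print(result.convert())

query() is the call that surfaces the URLError in an offline build, since it is where the wrapped urlopen actually opens the socket.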
________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0234aaf0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0232ab30>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testConstructByPOSTinRDFXML>

    def testConstructByPOSTinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, POST)

test/test_agrovoc-allegrograph_on_hold.py:506:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0234a150>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e270>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testConstructByPOSTinUnknow>

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_agrovoc-allegrograph_on_hold.py:601:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
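As request.py:1322 shows, urllib wraps the low-level OSError in a URLError, keeping the original exception on the reason attribute. A small sketch of what a caller sees, again assuming nothing listens on 127.0.0.1:9:

    from urllib.request import urlopen
    from urllib.error import URLError

    try:
        urlopen("http://127.0.0.1:9/", timeout=2)
    except URLError as err:
        # err.reason is the wrapped ConnectionRefusedError from the socket layer
        print(type(err.reason).__name__, err.reason)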
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0234a2b0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023dd1d0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testConstructByPOSTinXML>

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_agrovoc-allegrograph_on_hold.py:492:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0234afc0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023de890>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinN3>

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)

test/test_agrovoc-allegrograph_on_hold.py:643:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
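The test module's _on_hold suffix suggests these network-bound tests are intentionally parked. One common way to keep such tests from failing in offline builds is to probe the endpoint and skip; endpoint_reachable below is a hypothetical helper, not part of the actual suite:

    import socket
    import unittest

    def endpoint_reachable(host, port=443, timeout=2.0):
        """Hypothetical helper: can we open a TCP connection to the endpoint?"""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("agrovoc.fao.org"),
                         "SPARQL endpoint unreachable (offline build)")
    class SPARQLWrapperTests(unittest.TestCase):
        ...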
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0234b1d0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023df230>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML>

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_agrovoc-allegrograph_on_hold.py:629:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
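Alternatively, a suite like this can run fully offline by stubbing the opener. The traceback shows Wrapper.py:926 calling a module-level urlopener, so that is the natural patch target; the canned response below is invented for illustration:

    from unittest import mock

    # Hypothetical offline stub: patching SPARQLWrapper.Wrapper.urlopener
    # keeps the whole request/response cycle in-process, no socket opened.
    fake_response = mock.Mock()
    fake_response.read.return_value = b"<?xml version='1.0'?><sparql/>"  # made-up body

    with mock.patch("SPARQLWrapper.Wrapper.urlopener", return_value=fake_response):
        pass  # run the query under test here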
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f025b8f70>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5de10>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow>

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_agrovoc-allegrograph_on_hold.py:724:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
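The Accept headers in these preludes show the format negotiation at work: XML and RDFXML requests send application/rdf+xml, N3 requests send a Turtle/N3 media-type list, and unknown formats ("foo", "bar") fall back to application/rdf+xml. A hypothetical condensation of that mapping, inferred from this log rather than from SPARQLWrapper's source:

    # Hypothetical condensation of the Accept headers observed above;
    # SPARQLWrapper's real negotiation also depends on the query type.
    ACCEPT_BY_FORMAT = {
        "xml": "application/rdf+xml",
        "rdfxml": "application/rdf+xml",
        "n3": "application/turtle,text/turtle,text/rdf+n3,"
              "application/n-triples,application/n3,text/n3",
    }

    def accept_header(fmt):
        # Unknown formats ("foo", "bar" in the tests) fell back to RDF/XML.
        return ACCEPT_BY_FORMAT.get(fmt.lower(), "application/rdf+xml")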
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinXML> def testDescribeByGETinXML(self): > result = self.__generic(describeQuery, XML, GET) test/test_agrovoc-allegrograph_on_hold.py:615: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f024c20a0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f770> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
__________________ SPARQLWrapperTests.testDescribeByPOSTinN3 ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f023c54f0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023dd7f0>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open and create_connection listings identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testDescribeByPOSTinN3>

    def testDescribeByPOSTinN3(self):
>       result = self.__generic(describeQuery, N3, POST)

test/test_agrovoc-allegrograph_on_hold.py:650:
[... call chain through __generic, SPARQLWrapper/Wrapper.py and urllib identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f023c78b0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023def90>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open and create_connection listings identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testDescribeByPOSTinRDFXML>

    def testDescribeByPOSTinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, POST)

test/test_agrovoc-allegrograph_on_hold.py:636:
[... call chain through __generic, SPARQLWrapper/Wrapper.py and urllib identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f023c6410>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e890>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open and create_connection listings identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testDescribeByPOSTinUnknow>

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)

test/test_agrovoc-allegrograph_on_hold.py:732:
[... call chain through __generic, SPARQLWrapper/Wrapper.py and urllib identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f023c5910>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5cd70>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open and create_connection listings identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testDescribeByPOSTinXML>

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)

test/test_agrovoc-allegrograph_on_hold.py:622:
[... call chain through __generic, SPARQLWrapper/Wrapper.py and urllib identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
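The create_connection docstring repeated in these tracebacks describes an all_errors switch. A small sketch (not from the package; requires Python 3.11+ for ExceptionGroup) showing both behaviours against the same kind of unreachable address the tests hit:

    # Sketch only: the all_errors behaviour from the create_connection
    # docstring above.  Port 9 mirrors the unreachable 127.0.0.1:9 address
    # these tests were pointed at.
    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except ExceptionGroup as eg:
        # one entry per getaddrinfo() result that failed to connect
        print([type(exc).__name__ for exc in eg.exceptions])
    except OSError as exc:
        # with the default all_errors=False only the last error propagates,
        # e.g. the bare ConnectionRefusedError seen in these tracebacks
        print(type(exc).__name__, exc)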
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f023c5a70>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023de970>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open and create_connection listings identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testKeepAlive>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_agrovoc-allegrograph_on_hold.py:757:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib call chain and do_open listing identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
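For reference, a standalone sketch of the keep-alive setup exercised by testKeepAlive above; the endpoint URL is a placeholder, and against an unreachable host the final call fails with the same URLError:

    # Sketch only: the keep-alive configuration mirrored from testKeepAlive.
    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    sparql = SPARQLWrapper("https://example.org/sparql")  # hypothetical endpoint
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    # setUseKeepAlive() installs the optional keepalive handler; if the
    # keepalive package is missing it only emits a warning and is a no-op.
    sparql.setUseKeepAlive()
    sparql.query()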
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

[... do_open and create_connection listings identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testQueryBadFormed>

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:742:
[... call chain through __generic, SPARQLWrapper/Wrapper.py and urllib identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
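testQueryBadFormed expects QueryBadFormed (SPARQLWrapper's mapping for an HTTP 400 answer) but never reaches an HTTP server, so it dies with URLError instead. A sketch of the intended pattern, with a placeholder endpoint and a deliberately broken query:

    # Sketch only: the assertion pattern testQueryBadFormed uses.  Against a
    # live endpoint a syntactically broken query yields HTTP 400, which
    # SPARQLWrapper raises as QueryBadFormed; the endpoint here is hypothetical.
    import unittest

    from SPARQLWrapper import SPARQLWrapper, XML
    from SPARQLWrapper.SPARQLExceptions import QueryBadFormed

    class BadFormedExample(unittest.TestCase):
        def test_bad_query_raises(self):
            sparql = SPARQLWrapper("https://example.org/sparql")  # hypothetical
            sparql.setQuery("SELECT * WHERE { this is not sparql ")  # malformed on purpose
            sparql.setReturnFormat(XML)
            self.assertRaises(QueryBadFormed, sparql.query)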
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f023c6570>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5d8d0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open and create_connection listings identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testQueryDuplicatedPrefix>

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:748:
[... call chain through __generic, SPARQLWrapper/Wrapper.py and urllib identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f023c6990>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5ea50>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open and create_connection listings identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testQueryManyPrefixes>

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:745:
[... call chain through __generic, SPARQLWrapper/Wrapper.py and urllib identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0234ba10> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02230130> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
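Every failure in this run shares one root cause: the pbuilder environment has network access disabled, and the suite's HTTPS requests are pointed at 127.0.0.1 port 9 (the "discard" port, taken from the locals above), where nothing listens, so the TCP handshake is refused before any SPARQL query leaves the build host. A minimal sketch that reproduces the underlying error:

    import socket

    # Nothing listens on port 9 in the build chroot, so connect()
    # fails immediately with the same errno seen in every failure.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused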
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f0234ba10>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02230130>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:769:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
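The outer frames show the fixed call chain from the test helper into urllib: query() wraps _query(), which hands a Request to the module-level opener. A sketch of roughly what the __generic helper does before it hits urlopen; the endpoint URL and query text here are illustrative, not the suite's exact values:

    from SPARQLWrapper import SPARQLWrapper, XML, GET

    sparql = SPARQLWrapper("https://agrovoc.fao.org/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)
    result = sparql.query()  # raises urllib.error.URLError when offline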
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f0234b960>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023dc670>
headers = {'Accept': 'text/csv', 'Connection': 'close',
           'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_agrovoc-allegrograph_on_hold.py:232:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
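The paired tracebacks illustrate urllib's exception wrapping: do_open() catches the OSError from the socket layer and re-raises it as URLError, so callers see a single exception type but can still reach the original errno via .reason. A small demonstration against the same unreachable address:

    import urllib.request
    import urllib.error

    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        print(repr(err.reason))  # ConnectionRefusedError(111, ...)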
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f0234be30>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5cad0>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_agrovoc-allegrograph_on_hold.py:260:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
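The create_connection() source shown earlier tries each getaddrinfo() result in turn; when all_errors is False (the default), the exceptions list is cleared between attempts, so the `raise exceptions[0]` frame above re-raises only the last failure. Since Python 3.11, all_errors=True surfaces every attempted address as an ExceptionGroup instead, as in this sketch:

    import socket

    # "localhost" may resolve to both ::1 and 127.0.0.1, producing one
    # refused attempt per address; except* unpacks the resulting group.
    try:
        socket.create_connection(("localhost", 9), timeout=5,
                                 all_errors=True)
    except* OSError as group:
        for exc in group.exceptions:
            print(exc)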
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f0234b5f0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02239fd0>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_agrovoc-allegrograph_on_hold.py:246:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
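Throughout these failures the connect target is 127.0.0.1:9 while the Host header names agrovoc.fao.org: the request is being tunnelled through an HTTP proxy (req._tunnel_host is set), which is also why the do_open() code above moves Proxy-Authorization off the origin headers and onto the CONNECT tunnel. A sketch of an opener wired the same way; the proxy URL mirrors the unreachable address from the tracebacks:

    import urllib.request

    proxy = urllib.request.ProxyHandler({"https": "http://127.0.0.1:9"})
    opener = urllib.request.build_opener(proxy)
    # opener.open("https://agrovoc.fao.org/sparql")  # URLError: refused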
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f02260050>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023dc3d0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_agrovoc-allegrograph_on_hold.py:319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
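Across the failures, only the Accept header varies with the requested return format, and the unrecognized format "foo" in testSelectByGETinUnknow appears to fall back to the XML Accept value. The mapping observed in the headers dicts above, collected as a Python table:

    # Accept values per return format, as seen in this log.
    ACCEPT = {
        "XML": "application/sparql-results+xml",
        "JSON": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",
        "CSV": "text/csv",
        "TSV": "text/tab-separated-values",
        "foo": "application/sparql-results+xml",  # unknown -> XML fallback
    }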
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f022605d0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02238d70>
headers = {'Accept': 'text/csv', 'Connection': 'close',
           'Content-Length': '432',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)

test/test_agrovoc-allegrograph_on_hold.py:239:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
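The POST variants differ from GET only in that the query is form-encoded into the request body, which is where the Content-Type and Content-Length headers come from (432 or 356 bytes here, depending on the query). A sketch of how such a body arises; the query text is illustrative:

    from urllib.parse import urlencode

    body = urlencode({"query": "SELECT ?s WHERE { ?s ?p ?o }"}).encode()
    # When passed as the data argument of a Request, urllib supplies
    # Content-Type: application/x-www-form-urlencoded by default and
    # computes Content-Length from this payload.
    print(len(body))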
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f02260890>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5eeb0>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Content-Length': '356',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)

test/test_agrovoc-allegrograph_on_hold.py:267:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testSelectByPOSTinTSV> def testSelectByPOSTinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST) test/test_agrovoc-allegrograph_on_hold.py:253: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f02260b50> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0223bd90> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f02260940> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0223b690> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
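Every failure in this run shares the same root cause: the pbuilder chroot has network access disabled, and the suite's endpoints resolve to 127.0.0.1:9, where nothing listens, so each connect() ends in ECONNREFUSED and urllib wraps it in a URLError. A minimal sketch of the failure mode follows; the endpoint URL is a placeholder, not the one configured in the test suite.

    # Minimal sketch (placeholder endpoint): reproducing the error above.
    from urllib.error import URLError

    from SPARQLWrapper import POST, TSV, SPARQLWrapper

    sparql = SPARQLWrapper("https://127.0.0.1:9/sparql")  # no listener on port 9
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setMethod(POST)
    sparql.setReturnFormat(TSV)

    try:
        sparql.query()
    except URLError as err:
        # err.reason carries the underlying ConnectionRefusedError ([Errno 111])
        print("query failed:", err.reason)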
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02260940>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0223b690>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical urllib do_open / http.client / socket.create_connection traceback as above ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testSelectByPOSTinUnknow>

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)

test/test_agrovoc-allegrograph_on_hold.py:329:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
[... identical SPARQLWrapper.query / urllib / do_open call chain as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
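The two-part tracebacks above show urllib's exception chaining: do_open catches the OSError from the socket layer ("During handling of the above exception, another exception occurred") and re-raises it as URLError, keeping the original exception as its .reason attribute. A short sketch of how a caller can recover the underlying errno; the URL is again a dead placeholder.

    # Sketch: unwrapping the chained error shown in the tracebacks above.
    import urllib.request
    from urllib.error import URLError

    try:
        urllib.request.urlopen("https://127.0.0.1:9/")  # placeholder dead endpoint
    except URLError as err:
        # do_open re-raised the socket error, so the original is preserved:
        assert isinstance(err.reason, ConnectionRefusedError)
        print(err.reason.errno)  # 111 (ECONNREFUSED) on Linux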
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02260890>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0223b3f0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical urllib do_open / http.client / socket.create_connection traceback as above ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_agrovoc-allegrograph_on_hold.SPARQLWrapperTests testMethod=testSelectByPOSTinXML>

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)

test/test_agrovoc-allegrograph_on_hold.py:224:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
[... identical SPARQLWrapper.query / urllib / do_open call chain as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
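The socket.py:864 frame ("raise exceptions[0]") comes from create_connection's address loop: it tries every address that getaddrinfo returns and, if none connects, re-raises the first collected error. A simplified sketch of that logic, not the stdlib source itself:

    # Simplified sketch of socket.create_connection's retry loop; the real
    # implementation is the one quoted in the tracebacks above.
    import socket

    def create_connection_sketch(address, timeout=None):
        host, port = address
        errors = []
        for af, socktype, proto, _name, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
                if timeout is not None:
                    sock.settimeout(timeout)
                sock.connect(sa)
                return sock
            except OSError as err:
                errors.append(err)
                if sock is not None:
                    sock.close()
        raise errors[0]  # for 127.0.0.1:9 this is the ConnectionRefusedError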
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02260940>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023dfd90>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... identical urllib do_open / http.client / socket.create_connection traceback as above ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByGETinJSON>

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_allegrograph__v4_14_1__mmi.py:572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
[... identical SPARQLWrapper.query / urllib / do_open call chain as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
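Each of these test modules fails identically because it unconditionally queries a live endpoint. One hypothetical mitigation, shown only as an illustration and not what this package's test suite actually does, is to probe the endpoint once and skip the network-bound class when it is unreachable:

    # Hypothetical guard, for illustration only: skip network-bound tests
    # when the endpoint cannot be reached (as in this pbuilder chroot).
    import socket
    import unittest

    def endpoint_reachable(host, port, timeout=2.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("127.0.0.1", 9),
                         "SPARQL endpoint unreachable")
    class NetworkBoundTests(unittest.TestCase):
        pass  # the queries exercised above would live here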
______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02262620>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02238590>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... identical urllib do_open / http.client / socket.create_connection traceback as above ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected>

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)

test/test_allegrograph__v4_14_1__mmi.py:647:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
[... identical SPARQLWrapper.query / urllib / do_open call chain as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02261650>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f022409f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... identical urllib do_open / http.client / socket.create_connection traceback as above ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected_Conneg>

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:658:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
[... identical SPARQLWrapper.query / urllib / do_open call chain as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f022622b0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0223b770>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... identical urllib do_open / http.client / socket.create_connection traceback as above ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByGETinJSON_Conneg>

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:579:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
[... identical SPARQLWrapper.query / urllib / do_open call chain as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
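The do_open source quoted throughout strips Proxy-Authorization from the origin-server headers and hands it to set_tunnel instead, so the credential travels only in the proxy CONNECT request. The same split, shown directly with http.client; both host names here are placeholders, not hosts from this build:

    # Sketch of the Proxy-Authorization / set_tunnel split performed by
    # do_open; proxy.example and sparql.example.org are placeholders.
    import base64
    import http.client

    cred = "Basic " + base64.b64encode(b"user:secret").decode()
    conn = http.client.HTTPSConnection("proxy.example", 3128)
    conn.set_tunnel("sparql.example.org", 443,
                    headers={"Proxy-Authorization": cred})
    # conn.request(...) would now CONNECT via the proxy; the credential is
    # sent to the proxy only, never to sparql.example.org.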
________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f02263120>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5ec10>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... identical urllib do_open / http.client / socket.create_connection traceback as above ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected>

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)

test/test_allegrograph__v4_14_1__mmi.py:603:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
[... identical SPARQLWrapper.query / urllib / do_open call chain as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
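The create_connection docstring quoted above also mentions the all_errors flag (Python 3.11+): instead of re-raising only the first per-address failure, it raises an ExceptionGroup carrying every collected error, which callers can match with except*. A short sketch against the same dead port:

    # Sketch: surfacing every connection attempt's error via all_errors.
    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except* ConnectionRefusedError as group:
        for err in group.exceptions:
            print("attempt failed:", err)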
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected_Conneg> def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:614: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f023c6570> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e5f0> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_allegrograph__v4_14_1__mmi.py:689:
[traceback identical to testAskByGETinN3_Unexpected_Conneg above: h.request()
 -> socket.create_connection() -> ConnectionRefusedError]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
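Note: the innermost frame in each failure is socket.create_connection(), whose docstring is quoted in the first traceback above. A minimal sketch of the same call in isolation (address and timeout are illustrative):

    import socket

    # create_connection() iterates over getaddrinfo() results, tries each
    # address in turn, and re-raises the last error if none succeeds; with
    # a closed port that error is ConnectionRefusedError (errno 111).
    try:
        sock = socket.create_connection(("127.0.0.1", 9), timeout=5)
        sock.close()
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused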
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:698:
[traceback identical to testAskByGETinN3_Unexpected_Conneg above: h.request()
 -> socket.create_connection() -> ConnectionRefusedError]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:460:
[traceback identical to testAskByGETinN3_Unexpected_Conneg above: h.request()
 -> socket.create_connection() -> ConnectionRefusedError]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
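Note: do_open() (listed in full in the first traceback above) normalizes the outgoing headers before sending. A sketch of just that step, with illustrative header values:

    # Force the connection to close after the single request (the response
    # object cannot manage a persistent connection), then Title-Case every
    # header name, exactly as do_open() does.
    headers = {"accept": "*/*", "user-agent": "sparqlwrapper 2.0.0"}
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': '*/*', 'User-Agent': 'sparqlwrapper 2.0.0', 'Connection': 'close'}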
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:468:
[traceback identical to testAskByGETinN3_Unexpected_Conneg above: h.request()
 -> socket.create_connection() -> ConnectionRefusedError]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:586:
[traceback identical to testAskByGETinN3_Unexpected_Conneg above: h.request()
 -> socket.create_connection() -> ConnectionRefusedError]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected ______________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '515', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, POST)

test/test_allegrograph__v4_14_1__mmi.py:669:
[traceback identical to testAskByGETinN3_Unexpected_Conneg above: h.request()
 -> socket.create_connection() -> ConnectionRefusedError]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
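Note: each failure prints two linked tracebacks joined by "During handling of the above exception, another exception occurred". A minimal sketch of that implicit exception chaining, mirroring the except-OSError/raise-URLError pattern in do_open():

    import urllib.error

    # Raising inside an except block links the new exception to the
    # original one via __context__, which is what pytest renders as the
    # two-part traceback above.
    try:
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:
            raise urllib.error.URLError(err)
    except urllib.error.URLError as exc:
        print(type(exc).__name__, "<-", type(exc.__context__).__name__)
        # URLError <- ConnectionRefusedError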
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:680:
[traceback identical to testAskByGETinN3_Unexpected_Conneg above: h.request()
 -> socket.create_connection() -> ConnectionRefusedError]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f02240e10> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023df150> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

[urllib.request do_open / http.client / socket.create_connection listing
 identical to the failure above; request headers differ only in
 Accept: 'application/sparql-results+json,application/json,text/javascript,application/javascript']
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:593:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________

[same duplicated traceback; request headers: Accept: '*/*', Content-Length: '475']
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:625:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
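As the do_open listing above shows, urllib.request converts the low-level OSError into a urllib.error.URLError, chaining the original exception, and that is the error SPARQLWrapper's query() ultimately raises to each test. A short sketch of the same wrapping (the URL is a placeholder, not taken from the test module):

    import urllib.error
    import urllib.request

    req = urllib.request.Request("https://127.0.0.1:9/", data=b"query=ASK%20%7B%7D")
    try:
        urllib.request.urlopen(req, timeout=5)
    except urllib.error.URLError as exc:
        # .reason carries the wrapped ConnectionRefusedError from the socket layer
        print(exc.reason)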
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

[same duplicated traceback; request headers: Accept: '*/*', Content-Length: '444']
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:636:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
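The *_Conneg variants exercise SPARQLWrapper's pure content-negotiation path: judging from the onlyConneg=True arguments above, they appear to correspond to the library's setOnlyConneg() switch, which makes the wrapper rely on the Accept header alone rather than also appending format parameters to the request. A sketch of that call pattern (the endpoint URL is a placeholder):

    from SPARQLWrapper import JSON, POST, SPARQLWrapper

    sparql = SPARQLWrapper("https://127.0.0.1:9/")  # placeholder endpoint
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(POST)
    sparql.setReturnFormat(JSON)
    sparql.setOnlyConneg(True)  # content negotiation only, as in the *_Conneg tests
    # sparql.query() would raise urllib.error.URLError while the endpoint is down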
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

[same duplicated traceback; request headers: Accept: 'application/sparql-results+xml', Content-Length: '478']
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_allegrograph__v4_14_1__mmi.py:707:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
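testAskByPOSTinUnknow passes the unrecognised return format "bar"; the Accept header recorded above ('application/sparql-results+xml') suggests SPARQLWrapper falls back to its default XML results format when the requested one is not supported. A sketch of that fallback (endpoint and behaviour as assumed here, not confirmed by the log beyond the Accept header):

    from SPARQLWrapper import POST, SPARQLWrapper

    sparql = SPARQLWrapper("https://127.0.0.1:9/")  # placeholder endpoint
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(POST)
    sparql.setReturnFormat("bar")  # unrecognised: the wrapper warns and keeps XML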
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

[same duplicated traceback; request headers: Accept: 'application/sparql-results+xml', Content-Length: '444']
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:716:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
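The create_connection docstring repeated through these tracebacks mentions the all_errors flag (Python 3.11+): with all_errors=True, the function raises an ExceptionGroup collecting the failure from every address tried rather than only the last one. A small sketch of that behaviour:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as eg:
        # eg is an ExceptionGroup holding one error per address getaddrinfo returned
        for err in eg.exceptions:
            print(type(err).__name__, err)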
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

[same duplicated traceback; request headers: Accept: 'application/sparql-results+xml', Content-Length: '478']
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)

test/test_allegrograph__v4_14_1__mmi.py:476:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

[same duplicated traceback; request headers: Accept: 'application/sparql-results+xml', Content-Length: '444']
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:484:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected> def testConstructByGETinCSV_Unexpected(self): > result = self.__generic(constructQuery, CSV, GET) test/test_allegrograph__v4_14_1__mmi.py:885: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f022409f0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f023ddc50> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected_Conneg>

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:894:
(Accept: '*/*'; ConnectionRefusedError and traceback identical to testConstructByGETinCSV_Unexpected above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinJSON_Unexpected>

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)

test/test_allegrograph__v4_14_1__mmi.py:921:
(Accept: '*/*'; ConnectionRefusedError and traceback identical to the failures above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
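For reference, each of these tests drives the same SPARQLWrapper entry point (sparql.query() inside __generic). A rough sketch of the equivalent call, assuming a reachable endpoint; the URL below is a placeholder, not the suite's fixture:

    from SPARQLWrapper import SPARQLWrapper, GET, JSON

    sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder endpoint
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSON)  # the suite also tries CSV, N3, RDF/XML and "foo"
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1")
    result = sparql.query()  # propagates urllib.error.URLError when the host is unreachable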
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinJSON_Unexpected_Conneg>

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:930:
(Accept: '*/*'; ConnectionRefusedError and traceback identical to the failures above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinN3>

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_allegrograph__v4_14_1__mmi.py:823:
(Accept: 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3'; ConnectionRefusedError and traceback identical to the failures above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinN3_Conneg>

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:830:
(Accept: 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3'; ConnectionRefusedError and traceback identical to the failures above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML>

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_allegrograph__v4_14_1__mmi.py:760:
(Accept: 'application/rdf+xml'; ConnectionRefusedError and traceback identical to the failures above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML_Conneg>

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:768:
(Accept: 'application/rdf+xml'; ConnectionRefusedError and traceback identical to the failures above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01d8c7e0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d74d70> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinUnknow> def testConstructByGETinUnknow(self): > result = self.__generic(constructQuery, "foo", GET) test/test_allegrograph__v4_14_1__mmi.py:956: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01d8c7e0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d74d70> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg>

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:964:
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... same do_open() -> create_connection() traceback as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinXML>

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:731:
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... same do_open() -> create_connection() traceback as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg>

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:738:
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... same do_open() -> create_connection() traceback as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected ____________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinCSV_Unexpected>

    def testConstructByPOSTinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, POST)

test/test_allegrograph__v4_14_1__mmi.py:903:
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[... same do_open() -> create_connection() traceback as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
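From here on the failing tests use POST: the query travels urlencoded in the request body, hence the Content-Type: application/x-www-form-urlencoded and a Content-Length that tracks the encoded query (662 here versus 628 for the _Conneg variant below, consistent with a format parameter being dropped when only content negotiation is used). A sketch of the request being built, with stand-in endpoint and query:

    from SPARQLWrapper import SPARQLWrapper, CSV, POST

    sparql = SPARQLWrapper("https://mmisw.org/sparql")  # stand-in endpoint URL
    sparql.setMethod(POST)        # query goes in the urlencoded request body
    sparql.setReturnFormat(CSV)
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o }")     # stand-in query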
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinCSV_Unexpected_Conneg>

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:912:
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[... same do_open() -> create_connection() traceback as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinJSON_Unexpected>

    def testConstructByPOSTinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:939:
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '665', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[... same do_open() -> create_connection() traceback as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
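CSV and JSON are SELECT/ASK result formats, not serializations a CONSTRUCT query can return, which is why these tests carry the _Unexpected suffix and why the requests above fall back to a wildcard Accept: */* header rather than asking for text/csv or application/sparql-results+json. A sketch of the mismatch, again with stand-in endpoint and query:

    from SPARQLWrapper import SPARQLWrapper, JSON, POST

    sparql = SPARQLWrapper("https://mmisw.org/sparql")  # stand-in endpoint URL
    sparql.setMethod(POST)
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o }")     # stand-in query
    sparql.setReturnFormat(JSON)  # not meaningful for CONSTRUCT; the request
                                  # carries Accept: */* (see headers above)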
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinJSON_Unexpected_Conneg>

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:948:
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[... same do_open() -> create_connection() traceback as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testConstructByPOSTinN3>

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:837:
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '659', 'Content-Type': 'application/x-www-form-urlencoded', ...}
[... same do_open() -> create_connection() traceback as above ...]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________
(POST to host '127.0.0.1:9'; Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3; Content-Length: 628)
test/test_allegrograph__v4_14_1__mmi.py:844: in testConstructByPOSTinN3_Conneg
    result = self.__generic(constructQuery, N3, POST, onlyConneg=True)
    [SPARQLWrapper/urllib/http.client/socket traceback identical to
     testConstructByPOSTinN3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________
(POST to host '127.0.0.1:9'; Accept: application/rdf+xml; Content-Length: 768)
test/test_allegrograph__v4_14_1__mmi.py:776: in testConstructByPOSTinRDFXML
    result = self.__generic(constructQuery, RDFXML, POST)
    [SPARQLWrapper/urllib/http.client/socket traceback identical to
     testConstructByPOSTinN3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
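The wrapping step at the bottom of do_open() is worth noting when reading these failures: the socket-level ConnectionRefusedError is caught as an OSError and re-raised as URLError, with the original exception kept in .reason. A small sketch of that behaviour (assuming, as above, that nothing listens on port 9):

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # .reason holds the underlying OSError raised by the socket layer
        print(type(err.reason).__name__)  # ConnectionRefusedError
        print(err.reason.errno)           # 111 (ECONNREFUSED on Linux)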
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________
(POST to host '127.0.0.1:9'; Accept: application/rdf+xml; Content-Length: 628)
test/test_allegrograph__v4_14_1__mmi.py:784: in testConstructByPOSTinRDFXML_Conneg
    result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)
    [SPARQLWrapper/urllib/http.client/socket traceback identical to
     testConstructByPOSTinN3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________
(POST to host '127.0.0.1:9'; Accept: application/rdf+xml; Content-Length: 662)
test/test_allegrograph__v4_14_1__mmi.py:972: in testConstructByPOSTinUnknow
    result = self.__generic(constructQuery, "bar", POST)
    [SPARQLWrapper/urllib/http.client/socket traceback identical to
     testConstructByPOSTinN3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
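At the lowest level, socket.create_connection() (whose source is quoted repeatedly in these tracebacks) tries each address returned by getaddrinfo() in turn and re-raises the recorded error when no attempt succeeds. A sketch of the same failure without urllib in the way:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err.errno)  # 111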
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________
(POST to host '127.0.0.1:9'; Accept: application/rdf+xml; Content-Length: 628)
test/test_allegrograph__v4_14_1__mmi.py:980: in testConstructByPOSTinUnknow_Conneg
    result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)
    [SPARQLWrapper/urllib/http.client/socket traceback identical to
     testConstructByPOSTinN3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________
(POST to host '127.0.0.1:9'; Accept: application/rdf+xml; Content-Length: 662)
test/test_allegrograph__v4_14_1__mmi.py:745: in testConstructByPOSTinXML
    result = self.__generic(constructQuery, XML, POST)
    [SPARQLWrapper/urllib/http.client/socket traceback identical to
     testConstructByPOSTinN3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________
(POST to host '127.0.0.1:9'; Accept: application/rdf+xml; Content-Length: 628)
test/test_allegrograph__v4_14_1__mmi.py:752: in testConstructByPOSTinXML_Conneg
    result = self.__generic(constructQuery, XML, POST, onlyConneg=True)
    [SPARQLWrapper/urllib/http.client/socket traceback identical to
     testConstructByPOSTinN3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________
(GET to host '127.0.0.1:9'; Accept: */*; Host: mmisw.org; User-Agent: sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper))
test/test_allegrograph__v4_14_1__mmi.py:1148: in testDescribeByGETinCSV_Unexpected
    result = self.__generic(describeQuery, CSV, GET)
    [SPARQLWrapper/urllib/http.client/socket traceback identical to
     testConstructByPOSTinN3 above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01d8dc80> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d390> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinCSV_Unexpected_Conneg> def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:1158: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01d8dc80> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d390> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e503c0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d74910> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected> def testDescribeByGETinJSON_Unexpected(self): > result = self.__generic(describeQuery, JSON, GET) test/test_allegrograph__v4_14_1__mmi.py:1185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e503c0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d74910> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e509f0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7cc90> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected_Conneg> def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:1194: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e509f0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7cc90> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e50260> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d74130> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinN3> def testDescribeByGETinN3(self): > result = self.__generic(describeQuery, N3, GET) test/test_allegrograph__v4_14_1__mmi.py:1086: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e50260> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d74130> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e50680> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5db70> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinN3_Conneg> def testDescribeByGETinN3_Conneg(self): > result = self.__generic(describeQuery, N3, GET, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:1093: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e50680> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5db70> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e51650> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d75470> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML> def testDescribeByGETinRDFXML(self): > result = self.__generic(describeQuery, RDFXML, GET) test/test_allegrograph__v4_14_1__mmi.py:1023: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e51650> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d75470> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e517b0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f3f0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML_Conneg> def testDescribeByGETinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:1031: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01e517b0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f3f0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01e51bd0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d77d90>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow>

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_allegrograph__v4_14_1__mmi.py:1220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[do_open() listing repeated verbatim by pytest -- elided; identical to the listing above]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
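Every failure in this run bottoms out in the same frame: socket.create_connection() trying to reach ('127.0.0.1', 9). A minimal sketch, assuming nothing listens on that port (true by construction in the network-less pbuilder chroot), reproduces the [Errno 111] directly:

    import socket

    try:
        # the same helper that http.client's connect() calls above
        sock = socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err)        # [Errno 111] Connection refused
    else:
        sock.close()      # not reached while the port stays closed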
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01e520a0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7cf30>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() / http.client / socket.create_connection frames identical to testDescribeByGETinUnknow above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow_Conneg>

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1228:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib open() / do_open() frames identical to testDescribeByGETinUnknow above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01e50d60>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d77310>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() / http.client / socket.create_connection frames identical to testDescribeByGETinUnknow above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinXML>

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:994:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib open() / do_open() frames identical to testDescribeByGETinUnknow above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
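Note the mismatch in the locals above: the connection host is '127.0.0.1:9' while the Host header names mmisw.org, and req._tunnel_host is set inside do_open(). That is what an HTTPS request routed through a proxy looks like. A sketch, assuming the build environment deliberately points the https proxy at a dead local port:

    import urllib.error
    import urllib.request

    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"https": "http://127.0.0.1:9"})
    )
    try:
        opener.open("https://mmisw.org/sparql", timeout=5)  # illustrative URL
    except urllib.error.URLError as err:
        print(err.reason)   # [Errno 111] Connection refused, as in the log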
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01e52780>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01e98c90>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() / http.client / socket.create_connection frames identical to testDescribeByGETinUnknow above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByGETinXML_Conneg>

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1001:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib open() / do_open() frames identical to testDescribeByGETinUnknow above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01e52990>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d76f90>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() / http.client / socket.create_connection frames identical to testDescribeByGETinUnknow above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByPOSTinCSV_Unexpected>

    def testDescribeByPOSTinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, POST)

test/test_allegrograph__v4_14_1__mmi.py:1167:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib open() / do_open() frames identical to testDescribeByGETinUnknow above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
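The POST variants differ only in transport: the query travels as an application/x-www-form-urlencoded body, which is why Content-Type and Content-Length now appear in the locals, and the Accept header falls back to */* because CSV is not an expected result format for a DESCRIBE query. A sketch with the same illustrative endpoint and IRI as before:

    from SPARQLWrapper import SPARQLWrapper, POST, CSV

    sparql = SPARQLWrapper("https://mmisw.org/sparql")      # assumed endpoint
    sparql.setQuery("DESCRIBE <http://example.org/thing>")  # hypothetical IRI
    sparql.setMethod(POST)        # query is sent urlencoded in the body
    sparql.setReturnFormat(CSV)   # unexpected for DESCRIBE -> Accept: */*
    result = sparql.query()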
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01e52c50>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d2b0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() / http.client / socket.create_connection frames identical to testDescribeByGETinUnknow above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByPOSTinCSV_Unexpected_Conneg>

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1176:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib open() / do_open() frames identical to testDescribeByGETinUnknow above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01e517b0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d550>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '468', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() / http.client / socket.create_connection frames identical to testDescribeByGETinUnknow above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSON_Unexpected>

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:1203:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib open() / do_open() frames identical to testDescribeByGETinUnknow above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
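One detail of do_open() worth noting from the listing: before sending, it normalizes header names with str.title(), which is why the locals always show 'Connection', 'Content-Type', 'User-Agent' and so on regardless of how the caller spelled them. The step in isolation:

    headers = {"user-agent": "sparqlwrapper 2.0.0", "ACCEPT": "*/*"}
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)   # {'User-Agent': 'sparqlwrapper 2.0.0', 'Accept': '*/*'}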
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01e51230>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c750>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open() / http.client / socket.create_connection frames identical to testDescribeByGETinUnknow above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSON_Unexpected_Conneg>

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1212:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib open() / do_open() frames identical to testDescribeByGETinUnknow above]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinN3 ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '462', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinN3(self):
>       result = self.__generic(describeQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:1100:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1107:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '571', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, POST)

test/test_allegrograph__v4_14_1__mmi.py:1039:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1047:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)

test/test_allegrograph__v4_14_1__mmi.py:1236:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1244:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)

test/test_allegrograph__v4_14_1__mmi.py:1008:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1015:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________________ SPARQLWrapperTests.testKeepAlive _______________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01d98cb0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c210> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
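All of the failures in this run share one root cause: the reproducible-builds environment disables network access, so the test suite's endpoint resolves to 127.0.0.1:9 (the discard port), where nothing listens, and every TCP connect is refused. A minimal standalone sketch of that failure mode — the host and port mirror the log; nothing here is part of the package under test:

    import socket
    import urllib.request
    import urllib.error

    # Port 9 is the "discard" service; nothing listens there on this
    # builder, so the TCP handshake is refused immediately.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print("raw socket error:", err)   # [Errno 111] Connection refused

    # urllib wraps the same OSError in URLError, which is what the
    # tracebacks show at urllib/request.py:1322.
    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        print("urllib error:", err)       # <urlopen error [Errno 111] Connection refused>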
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01d98cb0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c210>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source and socket.create_connection traceback identical to the first failure above]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testKeepAlive>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_allegrograph__v4_14_1__mmi.py:1269:
[remaining urllib frames and do_open context identical to the first failure above]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
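The body of testKeepAlive is visible in the traceback above; as a usage sketch, it amounts to the following (the endpoint URL is a placeholder here — the real suite targets a SPARQL service on mmisw.org):

    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    endpoint = "https://example.org/sparql"  # placeholder endpoint

    sparql = SPARQLWrapper(endpoint)
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()   # installs a keep-alive urllib handler when available
    results = sparql.query()   # raises URLError when the endpoint is unreachable, as above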
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

[do_open source and socket.create_connection traceback identical to the first failure above]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testQueryBadFormed>

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1254:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
[remaining urllib frames and do_open context identical to the first failure above]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
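testQueryBadFormed expects the endpoint to reject a malformed query with HTTP 400, which SPARQLWrapper maps to its QueryBadFormed exception; because the connection is refused before any HTTP exchange happens, URLError propagates instead and the assertion never sees the exception it expects. A sketch of the intended behaviour (the endpoint and query text are illustrative, not taken from the suite):

    from SPARQLWrapper import SPARQLWrapper, XML
    from SPARQLWrapper.SPARQLExceptions import QueryBadFormed

    sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder endpoint
    sparql.setQuery("SELECT * WHERE { this is not sparql }")
    sparql.setReturnFormat(XML)
    try:
        sparql.query()
    except QueryBadFormed as err:
        # A reachable server answers the broken query with HTTP 400,
        # which SPARQLWrapper converts into this exception.
        print("server rejected the query:", err)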
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01d992e0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f690>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source and socket.create_connection traceback identical to the first failure above]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testQueryDuplicatedPrefix>

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1260:
[remaining urllib frames and do_open context identical to the first failure above]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
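These tests inherently require a live endpoint. One illustrative guard — not something this test suite does, just a common pattern for network-dependent tests — is to probe the endpoint once and skip when it is unreachable:

    import socket
    import unittest

    def endpoint_reachable(host: str, port: int = 443, timeout: float = 3.0) -> bool:
        """Best-effort TCP probe; False when the connect is refused or times out."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("mmisw.org"),
                         "SPARQL endpoint unreachable (offline build?)")
    class LiveEndpointTests(unittest.TestCase):
        ...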
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01d989f0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5c670>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source and socket.create_connection traceback identical to the first failure above]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testQueryManyPrefixes>

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1257:
[remaining urllib frames and do_open context identical to the first failure above]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
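The do_open listing repeated in these tracebacks also documents urllib's CONNECT-tunnelling rule: Proxy-Authorization is moved into the tunnel headers so the credential reaches only the proxy, never the origin server. A small sketch of that handling with http.client directly (hosts, port, and credential are placeholders):

    import http.client

    headers = {"Proxy-Authorization": "Basic dXNlcjpwYXNz",  # placeholder credential
               "Accept": "application/sparql-results+xml"}

    conn = http.client.HTTPSConnection("proxy.example.org", 3128)
    tunnel_headers = {}
    if "Proxy-Authorization" in headers:
        # Send the credential during the CONNECT handshake only.
        tunnel_headers["Proxy-Authorization"] = headers.pop("Proxy-Authorization")
    conn.set_tunnel("mmisw.org", 443, headers=tunnel_headers)
    # conn.request("GET", "/sparql", headers=headers)  # origin request now excludes it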
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01d999c0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d757f0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source and socket.create_connection traceback identical to the first failure above]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testQueryWithComma_3>

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1281:
[remaining urllib frames and do_open context identical to the first failure above]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
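The socket.create_connection source shown in the first failure tries every address returned by getaddrinfo and, when all attempts fail, re-raises the first error — which is why a single ConnectionRefusedError surfaces even if several address families were tried. A simplified re-implementation of that loop, assuming Python 3:

    import socket

    def connect_first(host, port, timeout=5.0):
        """Simplified version of the create_connection loop above: try each
        resolved address in turn; if every attempt fails, re-raise the first
        error (Python 3.11+ can raise an ExceptionGroup via all_errors=True)."""
        errors = []
        for af, socktype, proto, _canon, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
                sock.settimeout(timeout)
                sock.connect(sa)
                return sock
            except OSError as err:
                errors.append(err)
                if sock is not None:
                    sock.close()
        raise errors[0]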
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01d99e90>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5d7f0>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source and socket.create_connection traceback identical to the first failure above]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByGETinCSV>

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_allegrograph__v4_14_1__mmi.py:245:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
[remaining urllib frames and do_open context identical to the first failure above]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
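The 'Accept: text/csv' header in the request context above is what SPARQLWrapper sends for setReturnFormat(CSV). A usage sketch with a placeholder endpoint; CSV results come back from convert() as raw bytes rather than a parsed structure:

    from SPARQLWrapper import SPARQLWrapper, CSV, GET

    sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder endpoint
    sparql.setQuery("SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(CSV)          # sets Accept: text/csv, as in the log
    csv_bytes = sparql.query().convert() # bytes for CSV/TSV formats
    print(csv_bytes.decode("utf-8"))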
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01d99650>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ce50>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source and socket.create_connection traceback identical to the first failure above]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByGETinCSV_Conneg>

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:252:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
[remaining urllib frames and do_open context identical to the first failure above]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01d9a410>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5c3d0>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source and socket.create_connection traceback identical to the first failure above]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByGETinJSON>

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_allegrograph__v4_14_1__mmi.py:301:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
[remaining urllib frames and do_open context identical to the first failure above]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected> def testSelectByGETinJSONLD_Unexpected(self): > result = self.__generic(selectQuery, JSONLD, GET) test/test_allegrograph__v4_14_1__mmi.py:376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01d9a6d0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d75010> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01d9a8e0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d75e10> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
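Every failure above follows the same shape: SPARQLWrapper's query() hands a urllib.request.Request to urlopen(), the connection to the sandboxed endpoint 127.0.0.1:9 is refused at the socket layer, and urllib re-raises the OSError as URLError. A minimal sketch (not from this build) of that error pair, assuming nothing listens on port 9; the URL is illustrative:

    import urllib.request
    import urllib.error

    try:
        # the connection is refused before any TLS or HTTP traffic happens
        urllib.request.urlopen("http://127.0.0.1:9/sparql")
    except urllib.error.URLError as err:
        print(err.reason)  # [Errno 111] Connection refused (a ConnectionRefusedError)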
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

test/test_allegrograph__v4_14_1__mmi.py:387: in testSelectByGETinJSONLD_Unexpected_Conneg
    result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
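The create_connection() docstring quoted repeatedly in these tracebacks documents the all_errors switch: by default only the last per-address error is raised, while all_errors=True raises an ExceptionGroup of every failed attempt. A small sketch of both behaviours (Python 3.11+, same unused port as in this log):

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1)
    except ConnectionRefusedError as err:
        print(err)                    # [Errno 111] Connection refused

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except ExceptionGroup as group:
        print(group.exceptions[0])    # the same ConnectionRefusedError, grouped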
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

test/test_allegrograph__v4_14_1__mmi.py:308: in testSelectByGETinJSON_Conneg
    result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

test/test_allegrograph__v4_14_1__mmi.py:332: in testSelectByGETinN3_Unexpected
    result = self.__generic(selectQuery, N3, GET)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

test/test_allegrograph__v4_14_1__mmi.py:343: in testSelectByGETinN3_Unexpected_Conneg
    result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
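The tunnelling branch of do_open shown in these tracebacks moves Proxy-Authorization from the request headers to the CONNECT headers, so the credential reaches only the proxy and is never sent to the origin server. A sketch of the same mechanism used directly, with a hypothetical proxy address and placeholder credentials:

    import http.client

    # hypothetical CONNECT proxy; the credential rides on the tunnel, not the request
    conn = http.client.HTTPSConnection("proxy.example.org", 3128)
    conn.set_tunnel("mmisw.org", 443,
                    headers={"Proxy-Authorization": "Basic <credentials>"})
    # conn.request("GET", "/sparql")  # would CONNECT first, then send the GET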
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

test/test_allegrograph__v4_14_1__mmi.py:273: in testSelectByGETinTSV
    result = self.__generic(selectQueryCSV_TSV, TSV, GET)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
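do_open also normalises the header dict before sending, as its comments explain: unredirected headers take precedence, Connection is forced to "close" so the single-use addinfourl response cannot stall on a persistent socket, and header names are title-cased. A sketch of that normalisation in isolation, with illustrative input headers:

    req_headers = {"accept": "text/tab-separated-values", "connection": "keep-alive"}

    headers = dict(req_headers)
    headers["Connection"] = "close"   # force a single-use connection
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)  # {'Accept': 'text/tab-separated-values', 'Connection': 'close'}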
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

test/test_allegrograph__v4_14_1__mmi.py:280: in testSelectByGETinTSV_Conneg
    result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByGETinUnknow> def testSelectByGETinUnknow(self): > result = self.__generic(selectQuery, "foo", GET) test/test_allegrograph__v4_14_1__mmi.py:418: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01d98940> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02000c90> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01d9b6a0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02001b70> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
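All of the failures in this run share the same bottom frame: the suite's HTTPS requests resolve to 127.0.0.1:9, where nothing is listening, so the kernel refuses the TCP connection before any HTTP is spoken. A minimal sketch of that innermost failure (standard library only; the address is taken from the tracebacks above):

    import socket

    # No listener on 127.0.0.1:9 in this build environment, so connect()
    # fails immediately with ECONNREFUSED (errno 111), exactly as in the
    # create_connection frames above.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1)
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused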
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

req = <urllib.request.Request object at 0x7f9f01d9b6a0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02001b70>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to testSelectByGETinUnknow above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByGETinUnknow_Conneg>

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:427:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

req = <urllib.request.Request object at 0x7f9f01d992e0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c210>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to testSelectByGETinUnknow above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByGETinXML>

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:213:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
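The upper half of every traceback enters through the same library path: the test's __generic() helper calls sparql.query(), Wrapper._query() hands a urllib Request to urlopen(), and https_open() lands in do_open(). A hedged sketch of the call the tests are effectively making (the endpoint URL and query string below are illustrative placeholders, not values from the test module):

    from SPARQLWrapper import SPARQLWrapper, XML, GET

    sparql = SPARQLWrapper("https://mmisw.org/sparql")  # illustrative endpoint
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")

    # query() builds the urllib Request and calls urlopen(), which is where
    # the tracebacks above enter do_open() and then create_connection().
    result = sparql.query()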
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

req = <urllib.request.Request object at 0x7f9f01d98940>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7fe70>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open and create_connection frames identical to testSelectByGETinUnknow above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByGETinXML_Conneg>

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:221:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

req = <urllib.request.Request object at 0x7f9f01d999c0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f310>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '664', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open and create_connection frames identical to testSelectByGETinUnknow above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByPOSTinCSV>

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)

test/test_allegrograph__v4_14_1__mmi.py:259:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
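One detail worth isolating from the do_open() frames: when a request is tunnelled through a proxy (req._tunnel_host is set), urllib moves Proxy-Authorization onto the CONNECT tunnel and deletes it from the headers bound for the origin server. The same logic reduced to plain dictionaries (the header values here are hypothetical):

    # Mirrors the tunnel-header handling shown in do_open(); values invented.
    headers = {"Proxy-Authorization": "Basic dXNlcjpwYXNz", "Accept": "text/csv"}
    tunnel_headers = {}
    proxy_auth_hdr = "Proxy-Authorization"
    if proxy_auth_hdr in headers:
        tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
        del headers[proxy_auth_hdr]  # must not leak to the origin server

    print(tunnel_headers)  # sent only with the CONNECT request
    print(headers)         # sent to the origin server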
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

req = <urllib.request.Request object at 0x7f9f01d9b540>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e6d0>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open and create_connection frames identical to testSelectByGETinUnknow above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByPOSTinCSV_Conneg>

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:266:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
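Another small behavior visible in do_open(): header names are normalized with str.title() before sending, which is why the frame probes req.has_header('Transfer-encoding') with exactly that capitalization. The normalization in isolation (the input names are hypothetical):

    # Same dict comprehension as in do_open(); inputs invented for illustration.
    headers = {"accept": "text/csv",
               "content-type": "application/x-www-form-urlencoded"}
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': 'text/csv', 'Content-Type': 'application/x-www-form-urlencoded'}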
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

req = <urllib.request.Request object at 0x7f9f020c4cb0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d8d0>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '487', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open and create_connection frames identical to testSelectByGETinUnknow above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByPOSTinJSON>

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:315:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
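Failures like these are characteristic of a suite that assumes a live SPARQL endpoint. One common guard, shown here purely as a sketch and not as what this package actually does, is to probe reachability once and skip the whole module when offline:

    import socket
    import pytest

    def _endpoint_reachable(host="mmisw.org", port=443, timeout=2.0):
        """Return True if a TCP connection to the endpoint can be opened."""
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    # Module-level marker: every test is skipped when the probe fails.
    pytestmark = pytest.mark.skipif(
        not _endpoint_reachable(),
        reason="SPARQL endpoint unreachable (offline build)")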
____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________

req = <urllib.request.Request object at 0x7f9f020c4e10>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f690>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open and create_connection frames identical to testSelectByGETinUnknow above]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected>

    def testSelectByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, POST)

test/test_allegrograph__v4_14_1__mmi.py:398:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
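The "During handling of the above exception, another exception occurred:" banner marks Python's implicit exception chaining: do_open() catches the OSError from the socket layer and raises URLError inside the except block, so pytest prints both tracebacks. A minimal reproduction of that structure (URLErrorLike is a hypothetical stand-in for urllib.error.URLError):

    class URLErrorLike(Exception):
        """Hypothetical stand-in for urllib.error.URLError."""

    try:
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:  # same shape as the except in do_open()
            raise URLErrorLike(err)
    except URLErrorLike as e:
        # e.__context__ is the original ConnectionRefusedError; that link is
        # what produces the "During handling" banner in the log above.
        print(repr(e.__context__))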
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected_Conneg> def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) test/test_allegrograph__v4_14_1__mmi.py:409: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f020c4c00> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7dd30> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________
test/test_allegrograph__v4_14_1__mmi.py:322: in testSelectByPOSTinJSON_Conneg
    result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________
test/test_allegrograph__v4_14_1__mmi.py:354: in testSelectByPOSTinN3_Unexpected
    result = self.__generic(selectQuery, N3, POST)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...}
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
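The two chained tracebacks shown for each test come from the same place in the standard library: socket.create_connection() raises ConnectionRefusedError (a subclass of OSError), and urllib's do_open() catches OSError and re-raises it wrapped in URLError, which is what SPARQLWrapper ultimately propagates. Reduced to its essentials, the wrapping looks like this:

    import socket
    import urllib.error

    def connect_or_urlerror(host: str, port: int) -> socket.socket:
        try:
            # The same call http.client makes when opening the connection.
            return socket.create_connection((host, port), timeout=5)
        except OSError as err:   # ConnectionRefusedError is an OSError
            raise urllib.error.URLError(err)

    # connect_or_urlerror('127.0.0.1', 9) raises
    # URLError(ConnectionRefusedError(111, 'Connection refused'))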
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________
test/test_allegrograph__v4_14_1__mmi.py:365: in testSelectByPOSTinN3_Unexpected_Conneg
    result = self.__generic(selectQuery, N3, POST, onlyConneg=True)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________
test/test_allegrograph__v4_14_1__mmi.py:287: in testSelectByPOSTinTSV
    result = self.__generic(selectQueryCSV_TSV, TSV, POST)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '764', 'Content-Type': 'application/x-www-form-urlencoded', ...}
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
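Note that none of the assertions in these tests are ever reached; every test dies while opening the connection. One way (not what this test suite does) to keep endpoint-dependent tests from failing in an offline chroot is to probe the endpoint once and skip:

    import socket
    import unittest

    def endpoint_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable('127.0.0.1', 9),
                         'SPARQL endpoint unreachable (offline build)')
    class SPARQLWrapperTests(unittest.TestCase):
        ...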
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________
test/test_allegrograph__v4_14_1__mmi.py:294: in testSelectByPOSTinTSV_Conneg
    result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...}
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________
test/test_allegrograph__v4_14_1__mmi.py:436: in testSelectByPOSTinUnknow
    result = self.__generic(selectQuery, "bar", POST)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
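The captured Accept headers vary with the requested return format: 'application/sparql-results+json,...' for JSON, 'text/tab-separated-values' for TSV, '*/*' for the unexpected-format cases (N3 or JSON-LD on a SELECT), and 'application/sparql-results+xml' both for XML and for the unknown format "bar", where SPARQLWrapper falls back to XML. A simplified, hypothetical sketch of that selection (the real mapping lives in SPARQLWrapper/Wrapper.py):

    # Hypothetical, simplified return-format -> Accept mapping for SELECT queries.
    _SELECT_ACCEPT = {
        'json': 'application/sparql-results+json,application/json,'
                'text/javascript,application/javascript',
        'xml': 'application/sparql-results+xml',
        'tsv': 'text/tab-separated-values',
    }

    def accept_for(return_format: str) -> str:
        # Unknown formats (e.g. 'bar') fall back to the XML results type,
        # matching the 'application/sparql-results+xml' seen above.
        return _SELECT_ACCEPT.get(return_format, _SELECT_ACCEPT['xml'])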
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________
test/test_allegrograph__v4_14_1__mmi.py:445: in testSelectByPOSTinUnknow_Conneg
    result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f020c6410> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f690> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:237:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_blazegraph__wikidata.py:580:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)

test/test_blazegraph__wikidata.py:655:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:666:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:587:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)

test/test_blazegraph__wikidata.py:611:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:622:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f020c76a0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d2b0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinUnknow_Conneg>
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:706:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinXML>
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_blazegraph__wikidata.py:484:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
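As the frames above show, urllib catches the low-level OSError inside do_open() and re-raises it as urllib.error.URLError, which is the exception pytest ultimately reports. A sketch of how a caller observes both layers; the URL is illustrative and points at the same closed local port:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the original ConnectionRefusedError
        print(type(err.reason).__name__, err.reason)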
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg>
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:492:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSON>
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '423', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_blazegraph__wikidata.py:594:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
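The failing tests drive SPARQLWrapper through the query() -> _query() -> urlopen() chain shown in these tracebacks. A rough sketch of the equivalent direct usage, assuming the public SPARQLWrapper 2.0.0 API (the endpoint URL and ASK query are illustrative; setOnlyConneg() is assumed to mirror what the *_Conneg test variants exercise):

    from SPARQLWrapper import SPARQLWrapper, JSON, POST

    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setReturnFormat(JSON)  # the tests also exercise XML, N3, JSON-LD, ...
    sparql.setMethod(POST)        # ... over both GET and POST
    sparql.setOnlyConneg(True)    # negotiate the format via the Accept header only

    result = sparql.query()       # raises urllib.error.URLError when unreachable
    print(result.convert())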
_____________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected ______________

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected>
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '457', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, POST)

test/test_blazegraph__wikidata.py:677:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected_Conneg>
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:688:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
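The create_connection() docstring quoted in these tracebacks notes that all_errors=True collects every failed connect attempt into an ExceptionGroup instead of re-raising only the last error. A sketch of that behaviour on Python 3.11+, again assuming a closed local port:

    import socket

    try:
        # all_errors=True requires Python 3.11+; port 9 is assumed closed.
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        for exc in group.exceptions:
            print(exc)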
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSON_Conneg>
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:601:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinN3_Unexpected>
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '417', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, POST)

test/test_blazegraph__wikidata.py:633:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
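The do_open() source repeated in these tracebacks documents two details worth noting: it forces "Connection: close" because the single-shot addinfourl response cannot manage a persistent connection, and for proxied HTTPS it opens a CONNECT tunnel via set_tunnel(), forwarding Proxy-Authorization only to the proxy, never to the origin server. A sketch of the same http.client calls, assuming a hypothetical proxy on localhost:3128:

    import http.client

    # Hypothetical forward proxy; the request fails if none is listening.
    conn = http.client.HTTPSConnection("localhost", 3128, timeout=5)
    conn.set_tunnel("query.wikidata.org", 443)  # CONNECT through the proxy
    try:
        conn.request("GET", "/sparql", headers={"Connection": "close"})
        print(conn.getresponse().status)
    except OSError as err:
        print(err)  # refused or timed out when no proxy is listening
    finally:
        conn.close()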
_______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01b5d390>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5c9f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '417', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """
    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinN3_Unexpected>

    def testAskByPOSTinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, POST)

test/test_blazegraph__wikidata.py:633:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
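create_connection() is the stdlib helper whose docstring appears in full above.
A short sketch of the parameters it describes, dialled against the same address
the tests end up hitting (port 9 is the TCP discard port; nothing listens there
in this environment):

    import socket

    try:
        # timeout and source_address behave as the docstring describes;
        # all_errors=True would raise an ExceptionGroup instead (3.11+)
        sock = socket.create_connection(("127.0.0.1", 9), timeout=5.0)
    except ConnectionRefusedError as err:
        print(err)          # [Errno 111] Connection refused
    else:
        sock.close()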
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01b5e830>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5ecf0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:644:
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01b5d4f0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c830>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_blazegraph__wikidata.py:715:
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
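The tunnelling branch that keeps reappearing in the do_open() listings only
fires for proxied HTTPS requests: Proxy-Authorization is moved onto the CONNECT
tunnel so it is never forwarded to the origin server. A hedged sketch of that
behaviour with http.client directly; the proxy host and credentials here are
placeholders:

    import base64
    import http.client

    conn = http.client.HTTPSConnection("proxy.example", 3128)  # hypothetical proxy
    creds = base64.b64encode(b"user:secret").decode("ascii")
    # the header goes on the CONNECT request only, mirroring do_open()
    conn.set_tunnel("query.wikidata.org", 443,
                    headers={"Proxy-Authorization": "Basic " + creds})
    conn.request("GET", "/sparql?query=ASK%20%7B%7D",
                 headers={"Connection": "close"})
    print(conn.getresponse().status)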
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01b5f070>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c670>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:724:
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01b5f800>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f070>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)

test/test_blazegraph__wikidata.py:500:
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01b5fac0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ec10>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:508:
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01b5fb70>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5c670>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)

test/test_blazegraph__wikidata.py:915:
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01b5fee0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7fcb0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:924:
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
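These failures are the expected outcome of a build that runs without network
access rather than bugs in the package itself. One common way such endpoint
tests are guarded is a reachability check at collection time; a sketch with a
hypothetical helper, not part of this test suite:

    import socket

    import pytest

    def _endpoint_up(host: str, port: int, timeout: float = 2.0) -> bool:
        """Return True if a TCP connection to host:port can be established."""
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    pytestmark = pytest.mark.skipif(
        not _endpoint_up("query.wikidata.org", 443),
        reason="SPARQL endpoint unreachable (offline build)",
    )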
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

req = <urllib.request.Request object at 0x7f9f01b5e570>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() listing, http.client frames and create_connection() listing
identical to the first failure; inner ConnectionRefusedError:
[Errno 111] Connection refused ...]

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinJSON_Unexpected_Conneg>

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:962:
[... __generic -> sparql.query() -> urllib frames as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

req = <urllib.request.Request object at 0x7f9f01b5ee60>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... frames identical to the first failure; inner ConnectionRefusedError:
[Errno 111] Connection refused ...]

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinN3_Conneg>

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:849:
[... __generic -> sparql.query() -> urllib frames as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
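Each of these tests exercises the same code path: test/__generic builds a SPARQLWrapper against the (redirected) Wikidata endpoint and calls query(), which is the Wrapper.py:960 frame seen in every traceback. A sketch of the equivalent standalone call, based on SPARQLWrapper 2.0.0's public API; the endpoint URL and query text here are illustrative, not the suite's exact fixtures:

    from SPARQLWrapper import SPARQLWrapper, GET, N3

    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
    sparql.setQuery("""
        CONSTRUCT { ?s ?p ?o }
        WHERE { ?s ?p ?o } LIMIT 5
    """)
    sparql.setMethod(GET)
    sparql.setReturnFormat(N3)   # drives the Accept header shown in the log
    sparql.setOnlyConneg(True)   # content negotiation only, no format= param
                                 # (the onlyConneg=True the tests pass)
    result = sparql.query()      # Wrapper.py:960 in the tracebacks
    print(result.convert())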
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

req = <urllib.request.Request object at 0x7f9f01b5e150>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... frames identical to the first failure; inner ConnectionRefusedError:
[Errno 111] Connection refused ...]

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML>

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_blazegraph__wikidata.py:768:
[... __generic -> sparql.query() -> urllib frames as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
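The repeated urllib frames spell out the dispatch chain: urlopen() -> OpenerDirector.open() -> _open() -> _call_chain() -> HTTPSHandler.https_open() -> do_open(). The same chain can be driven explicitly with a hand-built opener; a sketch, with an illustrative URL and Accept header:

    import urllib.request
    import urllib.error

    # debuglevel=1 makes the handler print the request/response on the wire.
    handler = urllib.request.HTTPSHandler(debuglevel=1)
    opener = urllib.request.build_opener(handler)

    req = urllib.request.Request(
        "https://query.wikidata.org/sparql?query=ASK%20%7B%7D",
        headers={"Accept": "application/sparql-results+json"},
    )
    try:
        with opener.open(req, timeout=10) as resp:
            print(resp.status, resp.read(200))
    except urllib.error.URLError as err:
        # With networking disabled this ends exactly like the log:
        # URLError wrapping [Errno 111] Connection refused.
        print("failed:", err.reason)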
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

req = <urllib.request.Request object at 0x7f9f01bcc890>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... frames identical to the first failure; inner ConnectionRefusedError:
[Errno 111] Connection refused ...]

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML_Conneg>

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:776:
[... __generic -> sparql.query() -> urllib frames as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

req = <urllib.request.Request object at 0x7f9f01bcc050>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... frames identical to the first failure; inner ConnectionRefusedError:
[Errno 111] Connection refused ...]

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE_Conneg>

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:811:
[... __generic -> sparql.query() -> urllib frames as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
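The create_connection() docstring quoted in each traceback also documents the all_errors flag: by default the last per-address error is raised (what happens here), while all_errors=True (Python 3.11+) collects every connection attempt into an ExceptionGroup, which matters when a host resolves to several A/AAAA records. A quick illustration, again assuming nothing listens on local port 9:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2,
                                 all_errors=True)
    except* ConnectionRefusedError as group:
        # group is an ExceptionGroup; one entry per address tried.
        for err in group.exceptions:
            print("attempt failed:", err)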
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

req = <urllib.request.Request object at 0x7f9f01bcd230>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... frames identical to the first failure; inner ConnectionRefusedError:
[Errno 111] Connection refused ...]

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinUnknow>

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_blazegraph__wikidata.py:990:
[... __generic -> sparql.query() -> urllib frames as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

req = <urllib.request.Request object at 0x7f9f01bcde90>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... frames identical to the first failure; inner ConnectionRefusedError:
[Errno 111] Connection refused ...]

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg>

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:998:
[... __generic -> sparql.query() -> urllib frames as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
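All of these failures are environmental: the build runs with networking disabled, so any test that needs query.wikidata.org is doomed before it starts. A common pattern for keeping such tests green in offline builds is a reachability-based skip marker; the sketch below is purely illustrative and is not how this package's test suite is actually wired:

    import socket
    import pytest

    def endpoint_reachable(host="query.wikidata.org", port=443, timeout=3):
        # Probe the endpoint once; any OSError (refused, unreachable,
        # DNS failure) means the network is unavailable.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    requires_network = pytest.mark.skipif(
        not endpoint_reachable(),
        reason="SPARQL endpoint unreachable (network disabled in build?)",
    )

    @requires_network
    def test_construct_by_get_in_rdfxml():
        ...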
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg> def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) test/test_blazegraph__wikidata.py:998: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01bcde90> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f020a6f90> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01bce150> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e350> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
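Every failure in this run has the same root cause: the test suite's endpoint resolves to 127.0.0.1:9 (the discard port) and pbuilder disables network access during the build, so the TCP connect is refused before any HTTP traffic happens. A minimal standalone sketch that reproduces the same error, assuming nothing is listening on local port 9:

    import socket

    try:
        # Ends in the same connect() shown in the create_connection listing above.
        sock = socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print("refused:", err)  # [Errno 111] Connection refused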
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bce150>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e350>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinXML>

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_blazegraph__wikidata.py:739:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
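The do_open listing shown earlier normalizes headers before sending: unredirected headers take precedence over regular ones, "Connection: close" is forced so the non-persistent addinfourl wrapper can drain the socket safely, and header names are title-cased. A small sketch of just that normalization step, with illustrative header values:

    def normalize_headers(unredirected_hdrs, hdrs):
        # Unredirected headers win over regular ones.
        headers = dict(unredirected_hdrs)
        headers.update({k: v for k, v in hdrs.items() if k not in headers})
        # Force a non-persistent connection, then title-case the names.
        headers["Connection"] = "close"
        return {name.title(): val for name, val in headers.items()}

    print(normalize_headers({"user-agent": "sparqlwrapper 2.0.0"},
                            {"accept": "application/rdf+xml"}))
    # {'User-Agent': 'sparqlwrapper 2.0.0', 'Accept': 'application/rdf+xml', 'Connection': 'close'}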
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bce410>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f020a4d70>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg>

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:746:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
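As the final frames show, do_open catches the OSError from the socket layer and re-raises it wrapped in urllib.error.URLError, so callers of urlopen only ever see the wrapped form. A sketch of catching it at the call site; the URL is illustrative, chosen to match the refused address in these logs:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the original ConnectionRefusedError
        print("wrapped:", err.reason)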
____________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected ____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bce4c0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f3f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByPOSTinCSV_Unexpected>

    def testConstructByPOSTinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, POST)

test/test_blazegraph__wikidata.py:933:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
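For context, the failing tests all go through SPARQLWrapper's public API (the sparql.query() frame in the chains above). A minimal sketch of the kind of CONSTRUCT request being exercised; the endpoint URL (taken from the Host header in these logs) and the query text are illustrative, not copied from the test suite:

    from SPARQLWrapper import SPARQLWrapper, CSV, POST

    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
    sparql.setMethod(POST)
    sparql.setReturnFormat(CSV)
    sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 1")
    result = sparql.query()  # raises urllib.error.URLError when offline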
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bce830>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f020a7e70>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByPOSTinCSV_Unexpected_Conneg>

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:942:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
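The *_Conneg test variants pass onlyConneg=True, i.e. they ask the endpoint for a result format purely via HTTP content negotiation (the Accept header visible in the locals) instead of an extra format/output query parameter. A sketch of the corresponding SPARQLWrapper call, assuming the installed version provides setOnlyConneg (treat the method name and the endpoint as assumptions, not verified against this package build):

    from SPARQLWrapper import SPARQLWrapper, JSONLD, POST

    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
    sparql.setMethod(POST)
    sparql.setOnlyConneg(True)   # rely on the Accept header only
    sparql.setReturnFormat(JSONLD)
    sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 1")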
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bce6d0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01bfd550>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByPOSTinJSONLD_Conneg>

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:906:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
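The tunnel branch in the do_open listing (the req._tunnel_host block) covers proxied HTTPS: the Proxy-Authorization header is attached to the CONNECT request only and deleted before the real request reaches the origin server. A sketch of the same mechanism used directly via http.client; the proxy address and credentials are hypothetical, and the tunnel target matches the Host header seen in these requests:

    import http.client

    conn = http.client.HTTPSConnection("proxy.example.org", 3128)
    conn.set_tunnel("query.wikidata.org", 443,
                    headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
    # conn.request("GET", "/sparql") would now CONNECT through the proxy
    # first, then start TLS with the tunnel target.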
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bcdd30>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f5b0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByPOSTinJSON_Unexpected_Conneg>

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:982:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bcdf40>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01bfc750>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByPOSTinN3_Conneg>

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:868:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
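The create_connection listing shown earlier tries every address returned by getaddrinfo in turn, collects the failures, and re-raises one of them when all candidates fail (or an ExceptionGroup when all_errors=True). A condensed re-implementation of that loop for reference, without the all_errors branch; connect_first is an illustrative name, not a stdlib function:

    import socket

    def connect_first(host, port, timeout=None):
        # Resolve, then attempt each (family, type, proto, addr) candidate.
        exceptions = []
        for af, socktype, proto, _canon, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
                if timeout is not None:
                    sock.settimeout(timeout)
                sock.connect(sa)
                return sock
            except OSError as err:
                if sock is not None:
                    sock.close()
                exceptions.append(err)
        if exceptions:
            raise exceptions[0]
        raise OSError("getaddrinfo returned an empty list")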
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testConstructByPOSTinRDFXML> def testConstructByPOSTinRDFXML(self): > result = self.__generic(constructQuery, RDFXML, POST) test/test_blazegraph__wikidata.py:784: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01b5d390> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e350> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '755', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bccd60>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5c4b0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_blazegraph__wikidata.py:792: in testConstructByPOSTinRDFXML_Conneg
    result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bcf280>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5eb30>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_blazegraph__wikidata.py:830: in testConstructByPOSTinTURTLE_Conneg
    result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
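The "_Conneg" variants differ from their plain counterparts only in relying on HTTP content negotiation, i.e. the Accept header shown in the locals above ('application/turtle,text/turtle'). A rough stdlib equivalent of such a request (endpoint and body are illustrative):

    import urllib.request

    req = urllib.request.Request(
        "https://127.0.0.1:9/sparql",              # placeholder endpoint
        data=b"query=CONSTRUCT+WHERE+%7B+%3Fs+%3Fp+%3Fo+%7D",
        headers={"Accept": "application/turtle,text/turtle",
                 "Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )
    # urllib.request.urlopen(req) would raise the same
    # URLError(ConnectionRefusedError) as the traceback above.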
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bcfac0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f310>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_blazegraph__wikidata.py:1006: in testConstructByPOSTinUnknow
    result = self.__generic(constructQuery, "bar", POST)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bceaf0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f690>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_blazegraph__wikidata.py:1014: in testConstructByPOSTinUnknow_Conneg
    result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
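The create_connection() docstring quoted in these tracebacks mentions the all_errors flag (Python 3.11+): instead of re-raising only the last per-address error, it can surface all of them as an ExceptionGroup. A small sketch of that behaviour against the same closed port:

    import socket

    try:
        sock = socket.create_connection(("127.0.0.1", 9), timeout=1,
                                        all_errors=True)
        sock.close()   # only reached if something actually listens on port 9
    except* ConnectionRefusedError as group:
        for exc in group.exceptions:   # one error per attempted address
            print(type(exc).__name__, exc)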
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bcf490>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5d630>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_blazegraph__wikidata.py:753: in testConstructByPOSTinXML
    result = self.__generic(constructQuery, XML, POST)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bcfee0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5edd0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_blazegraph__wikidata.py:760: in testConstructByPOSTinXML_Conneg
    result = self.__generic(constructQuery, XML, POST, onlyConneg=True)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
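Before sending, do_open() (listed in full earlier in this log) normalises headers in two ways visible in the locals above: it forces "Connection: close" and Title-Cases every header name. The same two lines, stand-alone:

    headers = {"accept": "application/rdf+xml",
               "content-type": "application/x-www-form-urlencoded"}

    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}

    print(headers)
    # {'Accept': 'application/rdf+xml',
    #  'Content-Type': 'application/x-www-form-urlencoded',
    #  'Connection': 'close'}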
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bccd60>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e190>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_blazegraph__wikidata.py:1200: in testDescribeByGETinCSV_Unexpected
    result = self.__generic(describeQuery, CSV, GET)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01bce410>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7fe70>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_blazegraph__wikidata.py:1210: in testDescribeByGETinCSV_Unexpected_Conneg
    result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
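Each failure here is an implicit exception chain ("During handling of the above exception, another exception occurred"): URLError is raised while handling ConnectionRefusedError, so the original error rides along as __context__. A minimal sketch of the same pattern:

    from urllib.error import URLError

    try:
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:      # as in do_open()
            raise URLError(err)
    except URLError as e:
        print(e)                    # <urlopen error [Errno 111] Connection refused>
        print(type(e.__context__))  # <class 'ConnectionRefusedError'>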
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01bcd650> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7cf30> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
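For context, every failure in this group walks the same client path that the
traceback shows: the test calls SPARQLWrapper's query(), Wrapper.py builds a
urllib request for the configured endpoint, and urlopen dials out. A minimal
sketch of that call pattern follows; the endpoint URL and the Q42 entity are
illustrative assumptions, and in this sandboxed build the endpoint resolves to
the unreachable 127.0.0.1:9, so the final call raises URLError instead of
returning a result.

# Sketch of the client path the failing tests exercise (assumes network
# access and a reachable SPARQL endpoint; endpoint URL and entity are
# illustrative, not taken from the test suite).
from SPARQLWrapper import SPARQLWrapper, CSV, GET

sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
sparql.setMethod(GET)               # DESCRIBE issued via HTTP GET, as in the tests
sparql.setReturnFormat(CSV)         # drives the Accept header ('*/*' here, per the log)
sparql.setQuery("DESCRIBE <http://www.wikidata.org/entity/Q42>")

try:
    result = sparql.query()         # Wrapper.py: query() -> _query() -> urlopen(request)
    print(result.response.read()[:200])
except Exception as exc:            # with no network this is urllib.error.URLError
    print("query failed:", exc)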
[The following tests fail with the identical traceback: do_open ->
create_connection -> ConnectionRefusedError: [Errno 111] Connection refused
against 127.0.0.1:9, re-raised as urllib.error.URLError at
/usr/lib/python3.13/urllib/request.py:1322. Only the test method, its line in
test/test_blazegraph__wikidata.py, and the Accept header sent differ:]

_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________
test/test_blazegraph__wikidata.py:1174: self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)
Accept: application/ld+json,application/x-json+ld
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________
test/test_blazegraph__wikidata.py:1248: self.__generic(describeQuery, JSON, GET, onlyConneg=True)
Accept: */*
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________
test/test_blazegraph__wikidata.py:1138: self.__generic(describeQuery, N3, GET, onlyConneg=True)
Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________
test/test_blazegraph__wikidata.py:1057: self.__generic(describeQuery, RDFXML, GET)
Accept: application/rdf+xml
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________
test/test_blazegraph__wikidata.py:1065: self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
Accept: application/rdf+xml
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________
test/test_blazegraph__wikidata.py:1100: self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
Accept: application/turtle,text/turtle
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________
test/test_blazegraph__wikidata.py:1276: self.__generic(describeQuery, "foo", GET)
Accept: application/rdf+xml
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________
test/test_blazegraph__wikidata.py:1284: self.__generic(describeQuery, "foo", GET, onlyConneg=True)
Accept: application/rdf+xml
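These failures are environmental rather than bugs in the code under test:
pbuilder disables network access during the build, so anything that dials out
is refused. One common way to keep such integration tests from failing in
offline builds is to probe for connectivity once and skip. The sketch below is
a hypothetical guard, not something this package's test suite is known to do;
the helper name and endpoint are assumptions.

# Sketch: skip network-dependent tests when the endpoint is unreachable.
# Hypothetical guard, not part of test_blazegraph__wikidata.py.
import socket
import pytest

def _endpoint_reachable(host="query.wikidata.org", port=443, timeout=3):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:   # covers ConnectionRefusedError, timeouts, DNS failure
        return False

requires_network = pytest.mark.skipif(
    not _endpoint_reachable(),
    reason="SPARQL endpoint unreachable (offline build environment)",
)

@requires_network
def test_describe_by_get_in_csv():
    ...  # body elided; the real tests live in SPARQLWrapperTests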
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... do_open() frame identical to the listing above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
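All failures in this section share one root cause: pbuilder disabled network access for the build, so the suite's HTTPS requests aimed at the Wikidata endpoint end up at 127.0.0.1:9, where nothing listens. A minimal sketch that reproduces the same error outside the test suite (the /sparql path and port are illustrative of this environment, not taken from the suite's configuration):

# Minimal repro: nothing listens on local port 9 in this chroot, so the
# TCP connect fails before any TLS or HTTP exchange happens.
import urllib.error
import urllib.request

req = urllib.request.Request("https://127.0.0.1:9/sparql")
try:
    urllib.request.urlopen(req, timeout=5)
except urllib.error.URLError as err:
    # do_open() wraps the socket-level OSError; it is exposed as .reason
    print(err.reason)  # [Errno 111] Connection refused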
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f02062e60>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c210>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByGETinXML>

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_blazegraph__wikidata.py:1028:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f020631d0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01bfdd30>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByGETinXML_Conneg>

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1035:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
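Each failure prints two tracebacks joined by "During handling of the above exception, another exception occurred" because do_open() catches the OSError and raises URLError(err) from inside the except block; Python then records the ConnectionRefusedError as the implicit __context__ of the URLError. The chaining in isolation:

import urllib.error

try:
    raise ConnectionRefusedError(111, "Connection refused")
except OSError as err:
    try:
        raise urllib.error.URLError(err)
    except urllib.error.URLError as wrapped:
        print(wrapped.reason)              # the original OSError
        print(wrapped.__context__ is err)  # True: chained, not replaced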
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected _____________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f020633e0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ef90> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinCSV_Unexpected> def testDescribeByPOSTinCSV_Unexpected(self): > result = self.__generic(describeQuery, CSV, POST) test/test_blazegraph__wikidata.py:1219: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f020633e0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ef90> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f020636a0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01bff150>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinCSV_Unexpected_Conneg>

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1228:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f02063540>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01cd4830>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSONLD_Conneg>

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1191:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f02062a40>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01bffcb0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSON_Unexpected_Conneg>

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1268:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f02063e30>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01bfeeb0>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinN3_Conneg>

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f02062fc0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01bff690>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() frames identical to the first failure above ...]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinRDFXML>

    def testDescribeByPOSTinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, POST)

test/test_blazegraph__wikidata.py:1073:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the first failure above ...]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
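A final detail visible in the repeated do_open() listing: when a request is tunnelled through a proxy, Proxy-Authorization is attached to the CONNECT request via set_tunnel() and removed from the headers bound for the origin server. The same handling in isolation (proxy host and credentials are hypothetical):

import http.client

headers = {"Proxy-Authorization": "Basic Zm9vOmJhcg==",  # hypothetical credentials
           "Accept": "application/rdf+xml"}

# Move the proxy credential onto the CONNECT request only.
tunnel_headers = {}
if "Proxy-Authorization" in headers:
    tunnel_headers["Proxy-Authorization"] = headers.pop("Proxy-Authorization")

conn = http.client.HTTPSConnection("proxy.example", 3128)  # hypothetical proxy
conn.set_tunnel("query.wikidata.org", 443, headers=tunnel_headers)
# conn.request("GET", "/sparql", headers=headers)  # origin headers, credential removed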
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f02063a10> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7def0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
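Every failure in this run bottoms out in the same way: create_connection() tries 127.0.0.1:9 (the TCP discard port, with nothing listening inside the network-isolated pbuilder chroot) and the kernel refuses the connection immediately. A minimal sketch, not taken from the test suite, that reproduces the errno 111 seen above:

    # Hypothetical standalone reproduction of the failure mode above.
    import socket

    try:
        sock = socket.create_connection(("127.0.0.1", 9), timeout=5)
        sock.close()
    except ConnectionRefusedError as err:
        print("refused as expected:", err)  # [Errno 111] Connection refused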
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f02063a10>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7def0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and create_connection source and intermediate frames identical to the
first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinRDFXML_Conneg>

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1081:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[urllib frames and repeated do_open dump identical to the first failure above,
ending:]

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01a08260>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01cd65f0>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and create_connection source and intermediate frames identical to the
first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinTURTLE_Conneg>

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1119:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[urllib frames and repeated do_open dump identical to the first failure above,
ending:]

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
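The do_open() source dumped above also explains why every request in this log goes out with "Connection: close" and Title-Cased header names. A standalone sketch of that normalization step, with made-up header values:

    # Same two expressions as in do_open() above, applied to a toy header dict.
    headers = {"accept": "text/turtle", "user-agent": "sparqlwrapper 2.0.0"}
    headers["Connection"] = "close"                                  # force a one-shot connection
    headers = {name.title(): val for name, val in headers.items()}  # Title-Case the names
    print(headers)
    # {'Accept': 'text/turtle', 'User-Agent': 'sparqlwrapper 2.0.0', 'Connection': 'close'}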
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f02061860>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5d630>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and create_connection source and intermediate frames identical to the
first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinUnknow>

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)

test/test_blazegraph__wikidata.py:1292:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[urllib frames and repeated do_open dump identical to the first failure above,
ending:]

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f02062db0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f150>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and create_connection source and intermediate frames identical to the
first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinUnknow_Conneg>

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1300:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[urllib frames and repeated do_open dump identical to the first failure above,
ending:]

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01bcd650>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5cd70>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and create_connection source and intermediate frames identical to the
first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinXML>

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)

test/test_blazegraph__wikidata.py:1042:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[urllib frames and repeated do_open dump identical to the first failure above,
ending:]

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01a08d60>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5de10>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and create_connection source and intermediate frames identical to the
first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testDescribeByPOSTinXML_Conneg>

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1049:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[urllib frames and repeated do_open dump identical to the first failure above,
ending:]

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
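The tunnel branch that appears in every do_open() dump ("if req._tunnel_host: ...") is inactive here, but it shows how urllib keeps Proxy-Authorization on the CONNECT request so the credential is never forwarded to the origin server. A sketch of the same pattern with a hypothetical proxy, using http.client directly:

    import http.client

    # Hypothetical proxy host/port; nothing is contacted until request() is called.
    conn = http.client.HTTPSConnection("proxy.example.invalid", 3128, timeout=5)
    conn.set_tunnel("query.wikidata.org", 443,
                    headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
    # conn.request("GET", "/sparql")  # would send CONNECT with the header above first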
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
req = <urllib.request.Request object at 0x7f9f01a09c80>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e890>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection source and intermediate frames identical to the
first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testKeepAlive>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_blazegraph__wikidata.py:1328:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[urllib frames and repeated do_open dump identical to the first failure above,
ending:]

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

[do_open and create_connection source and intermediate frames identical to the
first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testQueryBadFormed>

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_blazegraph__wikidata.py:1310:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[urllib frames and repeated do_open dump identical to the first failure above,
ending:]

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
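testQueryBadFormed expects SPARQLWrapper's QueryBadFormed exception (an HTTP 400 from the endpoint), but with no listener on 127.0.0.1:9 the connection is refused before any response exists, so assertRaises sees URLError instead and the test errors out. A sketch of an offline-safe variant of the same assertion style:

    import unittest
    from urllib.error import URLError
    from SPARQLWrapper import XML, SPARQLWrapper

    class OfflineExample(unittest.TestCase):
        def test_unreachable_endpoint_raises_urlerror(self):
            sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")  # discard port: refused
            sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 1")
            sparql.setReturnFormat(XML)
            self.assertRaises(URLError, sparql.query)

    if __name__ == "__main__":
        unittest.main()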
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01a09020> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e7b0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testQueryManyPrefixes> def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) test/test_blazegraph__wikidata.py:1313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01a09020> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e7b0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01a08d60> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5c4b0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testQueryWithComma_1> def testQueryWithComma_1(self): > result = self.__generic(queryWithCommaInCurie_1, XML, GET) test/test_blazegraph__wikidata.py:1332: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01a08d60> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5c4b0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
test/test_blazegraph__wikidata.py:1341: in testQueryWithComma_3
    result = self.__generic(queryWithCommaInUri, XML, GET)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________
host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
test/test_blazegraph__wikidata.py:267: in testSelectByGETinCSV_Conneg
    result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
test/test_blazegraph__wikidata.py:325: in testSelectByGETinJSON
    result = self.__generic(selectQuery, JSON, GET)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
test/test_blazegraph__wikidata.py:400: in testSelectByGETinJSONLD_Unexpected
    result = self.__generic(selectQuery, JSONLD, GET)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
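The SPARQLWrapper/Wrapper.py:960 and :926 frames show the path from the public API into urllib: query() wraps _query(), which hands the built Request to urlopen. A caller that wants to survive an unreachable endpoint therefore catches URLError; a minimal sketch, with the endpoint URL and query chosen for illustration:

    from urllib.error import URLError

    from SPARQLWrapper import JSON, SPARQLWrapper

    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(JSON)

    try:
        # query() -> _query() -> urlopener(request), as in the frames above.
        result = sparql.query().convert()
    except URLError as err:
        # With no route to the endpoint this is the same [Errno 111]
        # failure mode the test run above produces.
        print("endpoint unreachable:", err.reason)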
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
test/test_blazegraph__wikidata.py:411: in testSelectByGETinJSONLD_Unexpected_Conneg
    result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
test/test_blazegraph__wikidata.py:332: in testSelectByGETinJSON_Conneg
    result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01a0ba10> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f4d0> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
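Every failure in this section is the same event: the suite's SPARQL requests carry Host: query.wikidata.org but are connected to 127.0.0.1:9 (see host = '127.0.0.1:9' in the locals above), consistent with the no-network build environment steering traffic to the local discard port. Nothing listens there, so sock.connect() fails with errno 111 and urllib re-raises the OSError as URLError. A minimal sketch of that chain, assuming an unused local port; the URL is illustrative, not taken from the test suite:

# Minimal sketch of the failure chain seen above. Assumption: nothing
# listens on 127.0.0.1:9, matching the network-disabled build environment.
import urllib.error
import urllib.request

try:
    urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
except urllib.error.URLError as err:
    # AbstractHTTPHandler.do_open() catches the OSError raised by
    # sock.connect() (ConnectionRefusedError, errno 111) and re-raises
    # it as URLError -- hence the two chained tracebacks per test.
    print(err.reason)  # [Errno 111] Connection refused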
______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________

    def testSelectByGETinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, GET)

test/test_blazegraph__wikidata.py:356:
[do_open()/create_connection() traceback identical to testSelectByGETinJSON_Conneg above; Accept: '*/*']
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:367:
[same traceback; Accept: '*/*']
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:301:
[same traceback; Accept: 'text/tab-separated-values']
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_blazegraph__wikidata.py:442:
[same traceback; Accept: 'application/sparql-results+xml']
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:451:
[same traceback; Accept: 'application/sparql-results+xml']
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_blazegraph__wikidata.py:225:
[same traceback; Accept: 'application/sparql-results+xml']
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:233:
[same traceback; Accept: 'application/sparql-results+xml']
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:284:
[same traceback; POST body, Accept: 'text/csv', Content-Type: 'application/x-www-form-urlencoded', Content-Length: '406']
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01989700> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ce50> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '442', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
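Note: every failure in this block is the same event. The suite's SPARQL endpoint resolves to 127.0.0.1:9 (the TCP discard port, with nothing listening in this no-network build environment), the connect() is refused, and urllib's do_open() wraps the resulting OSError in a URLError. A minimal sketch of the same chain, using only the standard library:

    # Minimal sketch reproducing the exception chain seen above, assuming
    # nothing listens on 127.0.0.1:9 (true in this build chroot).
    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        # do_open() catches the OSError from the socket layer and re-raises
        # it wrapped, so the original ConnectionRefusedError is err.reason.
        print(type(err.reason).__name__)   # ConnectionRefusedError
        print(err)                         # <urlopen error [Errno 111] ...>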
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

host = '127.0.0.1:9', req = <urllib.request.Request object at 0x7f9f01989700>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '442', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)

test/test_blazegraph__wikidata.py:339:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
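Note: the failing call path (test method, then __generic at test/test_blazegraph__wikidata.py:201, then sparql.query()) corresponds to ordinary SPARQLWrapper usage along the following lines. The endpoint URL is a placeholder and the test helper's exact setup may differ:

    # Sketch of the SPARQLWrapper call path the tests exercise; the endpoint
    # URL here is a stand-in, not the one configured by the test suite.
    from SPARQLWrapper import SPARQLWrapper, JSON, POST

    sparql = SPARQLWrapper("https://127.0.0.1:9/sparql")  # unreachable here
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(POST)        # the tests in this block all use HTTP POST
    sparql.setReturnFormat(JSON)  # or CSV, TSV, N3, JSONLD, ...

    # query() builds a urllib.request.Request and calls urlopen(); with no
    # server listening this raises the URLError recorded in the log.
    result = sparql.query()       # -> QueryResult wrapping the HTTP response
    data = result.convert()      # parse according to the return format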
____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________

host = '127.0.0.1:9', req = <urllib.request.Request object at 0x7f9f01989650>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '476', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, POST)

test/test_blazegraph__wikidata.py:422:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9', req = <urllib.request.Request object at 0x7f9f01988260>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:433:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
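Note: the two-part tracebacks ("During handling of the above exception, another exception occurred:") are implicit exception chaining: do_open() raises URLError inside its except OSError block, so the original ConnectionRefusedError stays attached as __context__. A small sketch of the mechanism:

    # Sketch of implicit exception chaining, the mechanism behind the
    # "During handling of the above exception" sections in these tracebacks.
    class AppError(Exception):
        pass

    def fetch():
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:
            # Raising inside an except block links the new exception to the
            # one being handled via __context__ (implicit chaining).
            raise AppError(f"request failed: {err}")

    try:
        fetch()
    except AppError as err:
        print(type(err.__context__).__name__)  # ConnectionRefusedError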
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected_Conneg> def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) test/test_blazegraph__wikidata.py:433: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01988260> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7cd70> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01989c80> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01940830> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
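Note: socket.create_connection(), whose source is quoted in each traceback, tries every candidate returned by getaddrinfo() and collects the failures; by default it re-raises only the last one, while all_errors=True (Python 3.11 and later) raises an ExceptionGroup instead. A sketch of both behaviours against the same dead port:

    # Sketch of socket.create_connection() failure behaviour, as described
    # in the docstring quoted above (all_errors requires Python 3.11+).
    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2)
    except ConnectionRefusedError as err:
        print(err)              # [Errno 111] Connection refused (last error)

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2, all_errors=True)
    except* ConnectionRefusedError as group:
        # With all_errors=True the failures arrive as an ExceptionGroup,
        # one entry per (family, socktype) candidate from getaddrinfo().
        print(len(group.exceptions))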
______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________

host = '127.0.0.1:9', req = <urllib.request.Request object at 0x7f9f01a0b280>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, POST)

test/test_blazegraph__wikidata.py:378:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9', req = <urllib.request.Request object at 0x7f9f01a09020>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:389:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
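Note: the recurring do_open() listing also shows the proxy-tunnel handling: when a request goes through a CONNECT proxy, the Proxy-Authorization header is moved onto the tunnel request so it is never forwarded to the origin server. A sketch of the underlying http.client calls; the proxy address, credentials, and origin host below are placeholders:

    # Sketch of the http.client tunnel setup performed by do_open() above;
    # proxy address and credentials are placeholders, not from this build.
    import base64
    import http.client

    conn = http.client.HTTPSConnection("proxy.example.org", 3128, timeout=10)
    token = base64.b64encode(b"user:password").decode("ascii")

    # Headers passed to set_tunnel() travel only on the CONNECT request to
    # the proxy; regular request headers would reach the origin server,
    # which is exactly what do_open() avoids for Proxy-Authorization.
    conn.set_tunnel("query.wikidata.org", 443,
                    headers={"Proxy-Authorization": "Basic " + token})
    conn.request("GET", "/sparql?query=ASK%20%7B%7D")
    print(conn.getresponse().status)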
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

host = '127.0.0.1:9', req = <urllib.request.Request object at 0x7f9f01989180>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:318:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

host = '127.0.0.1:9', req = <urllib.request.Request object at 0x7f9f01988b50>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)

test/test_blazegraph__wikidata.py:460:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
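Note: these failures are environmental (the build runs without network access), not SPARQLWrapper regressions. A sketch of how a suite like this could skip cleanly when its endpoint is unreachable; the helper and endpoint constants are illustrative, not taken from the package:

    # Illustrative guard for network-dependent tests; the endpoint constant
    # and helper name are hypothetical, not from test_blazegraph__wikidata.py.
    import socket
    import unittest

    ENDPOINT_HOST, ENDPOINT_PORT = "127.0.0.1", 9  # placeholder endpoint

    def endpoint_reachable(host, port, timeout=2.0):
        """Return True if a TCP connection to (host, port) succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable(ENDPOINT_HOST, ENDPOINT_PORT),
                         "SPARQL endpoint unreachable (no network in build)")
    class SPARQLWrapperNetworkTests(unittest.TestCase):
        def test_select_post_json(self):
            ...  # would issue the real query here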
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

host = '127.0.0.1:9', req = <urllib.request.Request object at 0x7f9f0198aa40>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:469:
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

    def do_open(self, http_class, req, **http_conn_args):
        ...
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0198ac50> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5c670> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
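    # Sketch of how the ConnectionRefusedError above becomes the URLError each
    # test finally reports: do_open wraps any OSError from the transport in
    # urllib.error.URLError, producing the "During handling of the above
    # exception, another exception occurred" chaining seen throughout this log.
    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as exc:
        print(exc.reason)  # [Errno 111] Connection refused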
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testSelectByPOSTinXML> def testSelectByPOSTinXML(self): > result = self.__generic(selectQuery, XML, POST) test/test_blazegraph__wikidata.py:241: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0198ac50> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5c670> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0198afc0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5fd90> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testSelectByPOSTinXML_Conneg> def testSelectByPOSTinXML_Conneg(self): > result = self.__generic(selectQuery, XML, POST, onlyConneg=True) test/test_blazegraph__wikidata.py:249: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0198afc0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5fd90> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperCLIParser_Test.testInvalidFormat _________________ self = <test.test_cli.SPARQLWrapperCLIParser_Test testMethod=testInvalidFormat> def testInvalidFormat(self): with self.assertRaises(SystemExit) as cm: parse_args(["-Q", testquery, "-F", "jjssoonn"]) self.assertEqual(cm.exception.code, 2) > self.assertEqual( sys.stderr.getvalue().split("\n")[1], "rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rdf+xml', 'csv', 'tsv', 'json-ld')", ) E AssertionError: "rqw:[65 chars]from json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld)" != "rqw:[65 chars]from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rd[28 chars]ld')" E - rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld) E + rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rdf+xml', 'csv', 'tsv', 'json-ld') E ? + + + + + + + + + + + + + + + + + + test/test_cli.py:79: AssertionError ______________________ SPARQLWrapperCLI_Test.testQueryRDF ______________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0191a620>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02f72900> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
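    # Sketch of the one failure above that is not network-related
    # (SPARQLWrapperCLIParser_Test.testInvalidFormat): the Python 3.13 used in
    # this build formats argparse's "invalid choice" message without quoting
    # each choice, while the test still asserts the older quoted form, hence
    # the AssertionError diff. Choices list shortened for illustration.
    import argparse

    parser = argparse.ArgumentParser(prog="rqw")
    parser.add_argument("-F", "--format", choices=["json", "xml", "csv"])
    try:
        parser.parse_args(["-F", "jjssoonn"])
    except SystemExit:
        pass  # stderr on 3.13: ... invalid choice: 'jjssoonn' (choose from json, xml, csv)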
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryRDF> def testQueryRDF(self): > main(["-Q", "DESCRIBE <http://ja.wikipedia.org/wiki/SPARQL>", "-e", endpoint, "-F", "rdf"]) test/test_cli.py:249: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0191a620>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02f72900> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperCLI_Test.testQueryTo4store ____________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0198b800>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02859a90> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryTo4store> def testQueryTo4store(self): > main(["-e", "http://rdf.chise.org/sparql", "-Q", testquery]) test/test_cli.py:627: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0198b800>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02859a90> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperCLI_Test.testQueryToAgrovoc_AllegroGraph _____________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f019894f0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5da90> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryToAgrovoc_AllegroGraph> def testQueryToAgrovoc_AllegroGraph(self): > main(["-e", "https://agrovoc.fao.org/sparql", "-Q", testquery]) test/test_cli.py:459: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f019894f0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5da90> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperCLI_Test.testQueryToAllegroGraph _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0191a780> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e270> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryToAllegroGraph> def testQueryToAllegroGraph(self): > main(["-e", "https://mmisw.org/sparql", "-Q", testquery]) test/test_cli.py:378: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0191a780> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e270> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperCLI_Test.testQueryToBrazeGraph __________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0191a150> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01cd4130> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryToBrazeGraph> def testQueryToBrazeGraph(self): > main(["-e", "https://query.wikidata.org/sparql", "-Q", testquery]) test/test_cli.py:546: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0191a150> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01cd4130> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToFuseki2V3_6 _________________
test/test_cli.py:573: in testQueryToFuseki2V3_6
    main(["-e", "https://agrovoc.uniroma2.it/sparql/", "-Q", testquery])
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused (connecting to ('127.0.0.1', 9))
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
_________________ SPARQLWrapperCLI_Test.testQueryToFuseki2V3_8 _________________
test/test_cli.py:600: in testQueryToFuseki2V3_8
    main(["-e", "http://zbw.eu/beta/sparql/stw/query", "-Q", testquery])
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused (connecting to ('127.0.0.1', 9))
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
______________ SPARQLWrapperCLI_Test.testQueryToGraphDBEnterprise ______________
test/test_cli.py:405: in testQueryToGraphDBEnterprise
    main(["-e", "http://factforge.net/repositories/ff-news", "-Q", testquery])
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused (connecting to ('127.0.0.1', 9))
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
__________________ SPARQLWrapperCLI_Test.testQueryToLovFuseki __________________
test/test_cli.py:317: in testQueryToLovFuseki
    main(["-e", "https://lov.linkeddata.es/dataset/lov/sparql/", "-Q", testquery])
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused (connecting to ('127.0.0.1', 9))
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
____________________ SPARQLWrapperCLI_Test.testQueryToRDF4J ____________________
test/test_cli.py:344: in testQueryToRDF4J
    main(["-e", "http://vocabs.ands.org.au/repository/api/sparql/csiro_international-chronostratigraphic-chart_2018-revised-corrected", "-Q", testquery])
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused (connecting to ('127.0.0.1', 9))
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
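All of these CLI tests require a live SPARQL endpoint, which this environment deliberately withholds. One way such tests can be kept runnable offline, sketched here purely as an illustration (this is not necessarily what test_cli.py does; fake_urlopen and CANNED are hypothetical names), is to stub urllib.request.urlopen with a canned result:

    import io
    import json
    import urllib.request
    from unittest import mock

    # Canned empty SELECT result in application/sparql-results+json shape.
    CANNED = json.dumps(
        {"head": {"vars": ["s"]}, "results": {"bindings": []}}
    ).encode()

    def fake_urlopen(request, *args, **kwargs):
        # File-like object standing in for the HTTP response body.
        return io.BytesIO(CANNED)

    with mock.patch("urllib.request.urlopen", fake_urlopen):
        body = urllib.request.urlopen("http://example.org/sparql").read()
        print(json.loads(body))  # parsed without touching the network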
___________________ SPARQLWrapperCLI_Test.testQueryToStardog ___________________
test/test_cli.py:432: in testQueryToStardog
    main(["-e", "https://lindas.admin.ch/query", "-Q", testquery, "-m", POST])
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused (connecting to ('127.0.0.1', 9))
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
_________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV7 __________________
test/test_cli.py:516: in testQueryToVirtuosoV7
    main(["-e", "http://dbpedia.org/sparql", "-Q", testquery])
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused (connecting to ('127.0.0.1', 9))
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
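Each failing test exercises the same code path: main() configures a query against the given endpoint and ends in sparql.query().convert() (SPARQLWrapper/main.py:137 above). The equivalent direct library usage looks roughly like the sketch below; the query string is a stand-in for the suite's testquery, and like the tests it needs the network access this build denies:

    from SPARQLWrapper import JSON, SPARQLWrapper

    sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")  # stand-in query
    sparql.setReturnFormat(JSON)
    # Offline, this raises urllib.error.URLError exactly as in the log above.
    results = sparql.query().convert()
    print(results["results"]["bindings"])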
_________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV8 __________________
self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01919020>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f023f05a0> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server.
_________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV8 __________________

self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryToVirtuosoV8>

    def testQueryToVirtuosoV8(self):
>       main(["-e", "http://dbpedia-live.openlinksw.com/sparql", "-Q", testquery])

test/test_cli.py:486:
[do_open/create_connection frames identical to testQueryToVirtuosoV7 above;
 req = <urllib.request.Request object at 0x7f9f01919020>, host = '127.0.0.1:9',
 h = <http.client.HTTPConnection object at 0x7f9f023f05a0>,
 headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}]
E   ConnectionRefusedError: [Errno 111] Connection refused    (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>    (/usr/lib/python3.13/urllib/request.py:1322)
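The chained traceback pattern in every failure here ("During handling of the above exception, another exception occurred") comes from do_open catching the low-level OSError and re-raising it as URLError. A sketch of how a caller can reach the original error through the wrapper; the URL is a stand-in for the unreachable endpoint:

    # Sketch: urllib wraps the socket error in URLError; the original
    # exception is available as the .reason attribute.
    import urllib.request
    import urllib.error

    try:
        urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=1)
    except urllib.error.URLError as exc:
        # exc.reason is the underlying ConnectionRefusedError from socket.py
        print(type(exc.reason).__name__, exc.reason)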
_________________ SPARQLWrapperCLI_Test.testQueryWithEndpoint __________________

self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryWithEndpoint>

    def testQueryWithEndpoint(self):
>       main(
            [
                "-Q",
                testquery,
                "-e",
                endpoint,
            ]
        )

test/test_cli.py:97:
[do_open/create_connection frames identical to testQueryToVirtuosoV7 above;
 req = <urllib.request.Request object at 0x7f9f01918680>, host = '127.0.0.1:9',
 h = <http.client.HTTPConnection object at 0x7f9f02b4af10>,
 headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}]
E   ConnectionRefusedError: [Errno 111] Connection refused    (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>    (/usr/lib/python3.13/urllib/request.py:1322)
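All of these failures share one cause: the tests require a live SPARQL endpoint, and none is reachable from the build environment (the connection goes to 127.0.0.1:9 and is refused). A sketch of one way such tests could skip instead of fail; the endpoint_reachable helper is hypothetical, not part of this test suite:

    # Sketch with a hypothetical reachability probe; skips network tests
    # cleanly when no endpoint can be contacted.
    import socket
    import unittest

    def endpoint_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("dbpedia.org", 80),
                         "SPARQL endpoint not reachable (offline build?)")
    class NetworkQueryTest(unittest.TestCase):
        def test_query(self):
            ...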
___________________ SPARQLWrapperCLI_Test.testQueryWithFile ____________________

self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryWithFile>

    def testQueryWithFile(self):
>       main(["-f", testfile, "-e", endpoint])

test/test_cli.py:135:
[do_open/create_connection frames identical to testQueryToVirtuosoV7 above;
 req = <urllib.request.Request object at 0x7f9f01918260>, host = '127.0.0.1:9',
 h = <http.client.HTTPConnection object at 0x7f9f01aac350>,
 headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}]
E   ConnectionRefusedError: [Errno 111] Connection refused    (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>    (/usr/lib/python3.13/urllib/request.py:1322)
__________________ SPARQLWrapperCLI_Test.testQueryWithFileCSV __________________

self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryWithFileCSV>

    def testQueryWithFileCSV(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "csv"])

test/test_cli.py:291:
[do_open/create_connection frames identical to testQueryToVirtuosoV7 above;
 req = <urllib.request.Request object at 0x7f9f0191a2b0>, host = '127.0.0.1:9',
 h = <http.client.HTTPConnection object at 0x7f9f023b8450>,
 headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}]
E   ConnectionRefusedError: [Errno 111] Connection refused    (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>    (/usr/lib/python3.13/urllib/request.py:1322)
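The Accept headers differ per test and track the requested return format: 'text/csv' here, 'text/tab-separated-values' in the tsv test, and the SPARQL JSON media types by default. A sketch using SPARQLWrapper's format constants; endpoint and query are stand-ins:

    # Sketch: the -F csv CLI flag maps to the CSV return format, which is
    # what produces the 'Accept: text/csv' header visible above.
    from SPARQLWrapper import SPARQLWrapper, CSV

    sparql = SPARQLWrapper("http://example.org/sparql")   # stand-in endpoint
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(CSV)           # request text/csv, as in this test
    csv_payload = sparql.query().convert()  # raw CSV response body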
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperCLI_Test.testQueryWithFileN3 ___________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0191a990>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01a4db80> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryWithFileN3> def testQueryWithFileN3(self): > main(["-f", testfile, "-e", endpoint, "-F", "n3"]) test/test_cli.py:232: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0191a990>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01a4db80> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________________ SPARQLWrapperCLI_Test.testQueryWithFileRDFXML _________________

self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryWithFileRDFXML>

    def testQueryWithFileRDFXML(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "rdf+xml"])

test/test_cli.py:270:
[do_open/create_connection frames identical to testQueryToVirtuosoV7 above;
 req = <urllib.request.Request object at 0x7f9f0191aba0>, host = '127.0.0.1:9',
 h = <http.client.HTTPConnection object at 0x7f9f01a4fb60>,
 headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}]
E   ConnectionRefusedError: [Errno 111] Connection refused    (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>    (/usr/lib/python3.13/urllib/request.py:1322)
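The do_open source repeated in these frames forces 'Connection: close' because urllib's addinfourl class cannot manage a persistent connection. The same one-shot pattern done by hand with http.client; 127.0.0.1:9 mirrors the refused endpoint above:

    # Sketch: a single HTTP request with an explicit Connection: close,
    # matching the behaviour do_open enforces. Expects connection refused
    # here, exactly as in this log.
    import http.client

    conn = http.client.HTTPConnection("127.0.0.1", 9, timeout=1)
    try:
        conn.request("GET", "/", headers={"Connection": "close"})
        resp = conn.getresponse()
        body = resp.read()        # safe to read fully; the server will close
    except OSError as err:        # here: [Errno 111] Connection refused
        print("request failed:", err)
    finally:
        conn.close()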
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperCLI_Test.testQueryWithFileTSV __________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0191ae60>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d630> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryWithFileTSV> def testQueryWithFileTSV(self): > main(["-f", testfile, "-e", endpoint, "-F", "tsv"]) test/test_cli.py:304: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0191ae60>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d630> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________________ SPARQLWrapperCLI_Test.testQueryWithFileTurtle _________________

self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryWithFileTurtle>

    def testQueryWithFileTurtle(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "turtle"])

test/test_cli.py:188:
[do_open/create_connection frames identical to testQueryToVirtuosoV7 above;
 req = <urllib.request.Request object at 0x7f9f0191b120>, host = '127.0.0.1:9',
 h = <http.client.HTTPConnection object at 0x7f9f01941710>,
 headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}]
E   ConnectionRefusedError: [Errno 111] Connection refused    (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>    (/usr/lib/python3.13/urllib/request.py:1322)
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperCLI_Test.testQueryWithFileTurtleQuiet ______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01919ff0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5e5f0> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
______________ SPARQLWrapperCLI_Test.testQueryWithFileTurtleQuiet ______________

self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryWithFileTurtleQuiet>

    def testQueryWithFileTurtleQuiet(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "turtle", "-q"])

test/test_cli.py:205:
[... do_open()/create_connection() traceback identical to the failure above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperCLI_Test.testQueryWithFileXML __________________

self = <test.test_cli.SPARQLWrapperCLI_Test testMethod=testQueryWithFileXML>

    def testQueryWithFileXML(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "xml"])

test/test_cli.py:167:
[... identical traceback elided; request Accept header: 'application/sparql-results+xml' ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
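All of these tests funnel through the same SPARQLWrapper call path visible in the frames above (sparql.query().convert() in SPARQLWrapper/main.py:137 and Wrapper.py:960/926). A minimal sketch of that path; the endpoint URL is illustrative, built from the Host header seen in this log:

    from SPARQLWrapper import SPARQLWrapper, JSON

    # query() performs the HTTP request via urllib (Wrapper.py:926);
    # convert() parses the response into the requested format.
    sparql = SPARQLWrapper("http://ja.dbpedia.org/sparql")
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()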
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinCSV>

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:489:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[... same traceback as above, this time via https_open()/HTTPSConnection;
     Host: 'agrovoc.uniroma2.it', Accept: 'text/csv',
     User-Agent: 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)' ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinCSV_Conneg>

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:496:
[... identical traceback elided ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinJSON>

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:545:
[... identical traceback elided; request Accept header:
     'application/sparql-results+json,application/json,text/javascript,application/javascript' ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected_Conneg>

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:629:
[... identical traceback elided; request Accept header: '*/*' ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinJSON_Conneg>

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:552:
[... identical traceback elided ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected_Conneg>

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:587:
[... identical traceback elided; request Accept header: '*/*' ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0191a830>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f070>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinTSV>

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:517:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0191b070>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f690>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinTSV_Conneg>

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:524:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
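For context, the failing tests all drive SPARQLWrapper against the AGROVOC endpoint, varying only the return format and HTTP method. A rough sketch of the kind of call the suite's __generic helper wraps, assuming a reachable endpoint; the URL and query text are stand-ins, not the suite's actual values:

    from SPARQLWrapper import SPARQLWrapper, GET, TSV

    # Stand-in endpoint URL; the tests target agrovoc.uniroma2.it.
    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")   # stand-in ASK query
    sparql.setMethod(GET)                       # the suite also exercises POST
    sparql.setReturnFormat(TSV)                 # drives the Accept header above
    result = sparql.query()                     # raises URLError when unreachable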
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0191adb0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7eb30>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinUnknow>

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_fuseki2__v3_6_0__agrovoc.py:658:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0191b490>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7fa10>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinUnknow_Conneg>

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:667:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
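The create_connection docstring quoted in these tracebacks mentions an all_errors flag (Python 3.11+): with all_errors=True the per-address failures are collected into an ExceptionGroup instead of only the last error being raised. A small sketch of that behavior against the same unreachable address the log records; the address is from the tracebacks, the rest is illustrative:

    import socket

    # With all_errors=True, create_connection raises an ExceptionGroup
    # containing one error per attempted address, caught here via except*.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        for err in group.exceptions:
            print(err)  # [Errno 111] Connection refused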
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0191b800>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f150>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinXML>

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:457:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f01919180>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ea50>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg>

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:465:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinCSV _____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f021b8260>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7da90>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinCSV>

    def testAskByPOSTinCSV(self):
>       result = self.__generic(askQuery, CSV, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:503:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinCSV_Conneg _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f021b8680>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f019404b0>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinCSV_Conneg>

    def testAskByPOSTinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:510:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
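As the tracebacks show, SPARQLWrapper/Wrapper.py lets the URLError from urlopen propagate out of query(). A hedged sketch of how a caller might handle this failure mode; the URL and query are stand-ins, not the suite's values:

    from urllib.error import URLError
    from SPARQLWrapper import SPARQLWrapper, POST, CSV

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")  # stand-in URL
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")                     # stand-in query
    sparql.setMethod(POST)
    sparql.setReturnFormat(CSV)
    try:
        sparql.query()
    except URLError as err:
        # URLError wraps the underlying OSError; the refused connection
        # is available via .reason, e.g. [Errno 111] Connection refused.
        print("endpoint unreachable:", err.reason)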
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f021b8100> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01942190> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '339', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
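Every failure in this section has the same root cause: the test suite points
SPARQLWrapper at https://127.0.0.1:9, nothing listens on port 9 in the build
chroot (and pbuilder disables network access during the build anyway), so the
TCP connect is refused and urllib re-raises the OSError as a URLError. A
minimal sketch of the same chain, independent of SPARQLWrapper (the /sparql
path is an illustrative assumption):

    import urllib.error
    import urllib.request

    try:
        # Nothing listens on 127.0.0.1:9, so connect() is refused and
        # urllib wraps the ConnectionRefusedError in a URLError.
        urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        print(err.reason)  # [Errno 111] Connection refused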
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

[same ConnectionRefusedError/URLError traceback as testAskByPOSTinCSV_Conneg
above; only the request context differs:]

req = <urllib.request.Request object at 0x7f9f021b8100>
h = <http.client.HTTPSConnection object at 0x7f9f01942190>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '339', 'Content-Type': 'application/x-www-form-urlencoded', ...}

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinJSON>

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:559:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

[same ConnectionRefusedError/URLError traceback as testAskByPOSTinCSV_Conneg
above; only the request context differs:]

req = <urllib.request.Request object at 0x7f9f021b8c00>
h = <http.client.HTTPSConnection object at 0x7f9f019433f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected_Conneg>

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:650:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

[same ConnectionRefusedError/URLError traceback as testAskByPOSTinCSV_Conneg
above; only the request context differs:]

req = <urllib.request.Request object at 0x7f9f021b8050>
h = <http.client.HTTPSConnection object at 0x7f9f01d7c830>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinJSON_Conneg>

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:566:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

[same ConnectionRefusedError/URLError traceback as testAskByPOSTinCSV_Conneg
above; only the request context differs:]

req = <urllib.request.Request object at 0x7f9f021b95a0>
h = <http.client.HTTPSConnection object at 0x7f9f019426d0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinN3_Unexpected_Conneg>

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:608:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
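These cases need a live Fuseki endpoint and can only fail inside an offline
pbuilder chroot. One common way to let such a suite degrade gracefully (a
sketch of a general technique, not something this package does) is to probe
the endpoint once and skip the network-dependent tests when it is unreachable:

    import socket
    import unittest

    def endpoint_reachable(host="127.0.0.1", port=9, timeout=1.0):
        # Plain TCP probe; ECONNREFUSED or a timeout means no endpoint.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable(), "SPARQL endpoint not reachable")
    class NetworkDependentTests(unittest.TestCase):
        ...  # hypothetical container for tests like the ones above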
____________________ SPARQLWrapperTests.testAskByPOSTinTSV _____________________

[same ConnectionRefusedError/URLError traceback as testAskByPOSTinCSV_Conneg
above; only the request context differs:]

req = <urllib.request.Request object at 0x7f9f021b8cb0>
h = <http.client.HTTPSConnection object at 0x7f9f01943cb0>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 'application/x-www-form-urlencoded', ...}

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinTSV>

    def testAskByPOSTinTSV(self):
>       result = self.__generic(askQuery, TSV, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:531:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByPOSTinTSV_Conneg _________________

[same ConnectionRefusedError/URLError traceback as testAskByPOSTinCSV_Conneg
above; only the request context differs:]

req = <urllib.request.Request object at 0x7f9f021b9860>
h = <http.client.HTTPSConnection object at 0x7f9f01943230>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinTSV_Conneg>

    def testAskByPOSTinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:538:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

[same ConnectionRefusedError/URLError traceback as testAskByPOSTinCSV_Conneg
above; only the request context differs:]

req = <urllib.request.Request object at 0x7f9f021ba410>
h = <http.client.HTTPSConnection object at 0x7f9f02120590>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow>

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_fuseki2__v3_6_0__agrovoc.py:676:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

[same ConnectionRefusedError/URLError traceback as testAskByPOSTinCSV_Conneg
above; only the request context differs:]

req = <urllib.request.Request object at 0x7f9f021ba6d0>
h = <http.client.HTTPSConnection object at 0x7f9f02120f30>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow_Conneg>

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:685:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
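The create_connection docstring quoted in these tracebacks mentions
all_errors (Python 3.11+): by default a single per-address error is re-raised
(the `raise exceptions[0]` frame above), while all_errors=True collects every
attempt into an ExceptionGroup. A small sketch of the grouped form:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1.0, all_errors=True)
    except* ConnectionRefusedError as group:
        for err in group.exceptions:
            print(err)  # [Errno 111] Connection refused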
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f021ba8e0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f019420b0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
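Every failure in this block has the same root cause: the requests are directed at 127.0.0.1:9, where nothing is listening, so the TCP handshake is refused before any SPARQL traffic is sent. A minimal sketch of that failure mode, assuming (as in this environment) that no listener is bound to port 9:

    import socket

    try:
        # Same target as the 'host' local in the traceback above; connect()
        # fails immediately because no process is listening on the port.
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused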
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

req = <urllib.request.Request object at 0x7f9f021ba8e0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f019420b0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinXML>

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:473:

test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

req = <urllib.request.Request object at 0x7f9f021baba0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02121e10>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testAskByPOSTinXML_Conneg>

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:481:

test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
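The frames above show every test funnelling through sparql.query() in SPARQLWrapper/Wrapper.py. Roughly what the suite's __generic helper does, sketched with an assumed endpoint path and query text (not the test suite's exact values):

    from SPARQLWrapper import POST, XML, SPARQLWrapper

    # Endpoint URL and query are assumptions for illustration only.
    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(POST)
    sparql.setReturnFormat(XML)
    result = sparql.query()  # raises urllib.error.URLError when unreachable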
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

req = <urllib.request.Request object at 0x7f9f021bad00>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7eb30>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected_Conneg>

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:874:

test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

req = <urllib.request.Request object at 0x7f9f021b8e10>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7fe70>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByGETinJSONLD>

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:831:

test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

req = <urllib.request.Request object at 0x7f9f021ba200>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7dc50>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByGETinJSONLD_Conneg>

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:839:

test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

req = <urllib.request.Request object at 0x7f9f021b8100>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f230>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByGETinJSON_Unexpected>

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:901:

test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

req = <urllib.request.Request object at 0x7f9f021bb280>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f5b0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByGETinJSON_Unexpected_Conneg>

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:910:

test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

req = <urllib.request.Request object at 0x7f9f021b9020>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ec10>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByGETinN3_Conneg>

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:806:

test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f021b9a70> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5db70> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
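Note on the call chain above: everything below sparql.query() in the trace is
stock urllib; SPARQLWrapper only builds the request. A minimal sketch of what
the failing test attempts, for orientation only (the host comes from the
logged Host: header; the /sparql path and the query text are assumptions):

    # Sketch only, not the packaged test code. Endpoint path and query are
    # assumed; the host is taken from the 'Host: agrovoc.uniroma2.it' header.
    from SPARQLWrapper import GET, N3, SPARQLWrapper

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(N3)
    sparql.setOnlyConneg(True)  # ask for N3 purely via the Accept header

    # query() hands the built request to urllib's urlopen(); with no network
    # reachable, this is exactly where the URLError above originates.
    result = sparql.query()
    print(result.response.read()[:200])

setOnlyConneg(True) is what produces the pure content-negotiation Accept
header visible in the logged request.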
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

Traceback identical to testConstructByGETinN3_Conneg above: sock.connect() to
the unreachable proxy ('127.0.0.1', 9) raises ConnectionRefusedError, which
do_open re-raises as URLError. Only the request and test frame differ:
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:738:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

Traceback identical to the one above; only the request and test frame differ:
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:772:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

Traceback identical to the one above; only the request and test frame differ:
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_fuseki2__v3_6_0__agrovoc.py:935:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

Traceback identical to the one above; only the request and test frame differ:
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:943:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

Traceback identical to the one above; only the request and test frame differ:
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:700:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

Traceback identical to the one above; only the request and test frame differ:
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:707:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

Traceback identical to the one above; only the request and test frame differ:
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:893:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
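The remaining failures exercise the POST path: the logged Content-Type:
application/x-www-form-urlencoded and Content-Length headers show the query
being form-encoded into the request body rather than the URL. A sketch of that
variant, under the same endpoint assumptions as the earlier examples:

    # Sketch only: same assumed endpoint and query as above, sent via POST.
    from SPARQLWrapper import JSONLD, POST, SPARQLWrapper

    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(POST)          # query goes form-encoded in the body
    sparql.setReturnFormat(JSONLD)
    result = sparql.query()         # raises the same URLError in this chroot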
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f021a8d60> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c210> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '702', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f021a8d60>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c210>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '702', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByPOSTinJSONLD>

    def testConstructByPOSTinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:847:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

req = <urllib.request.Request object at 0x7f9f021a8260>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01940590>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByPOSTinJSONLD_Conneg>

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:855:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
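Each failure reduces to the same SPARQLWrapper call sequence seen in the trace (Wrapper.py:960 in query, :926 in _query). A standalone sketch of what the tests exercise; the endpoint URL is a placeholder for the local Fuseki instance the suite expects:

from urllib.error import URLError
from SPARQLWrapper import SPARQLWrapper, JSONLD, POST

sparql = SPARQLWrapper("https://127.0.0.1:9/ds/sparql")  # hypothetical endpoint
sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
sparql.setMethod(POST)
sparql.setReturnFormat(JSONLD)
try:
    result = sparql.query()   # raises URLError when the endpoint is down
except URLError as exc:
    print("endpoint unreachable:", exc.reason)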
___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________

req = <urllib.request.Request object at 0x7f9f021a9910>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e5f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '527', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByPOSTinJSON_Unexpected>

    def testConstructByPOSTinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:918:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

req = <urllib.request.Request object at 0x7f9f021a9bd0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01942c10>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByPOSTinJSON_Unexpected_Conneg>

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:927:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
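The loop in create_connection() shown earlier tries every (family, type, proto, sockaddr) tuple that getaddrinfo() yields until one connects, collecting the failures in `exceptions`. The resolution step on its own (sketch):

import socket

for af, socktype, proto, _canonname, sa in socket.getaddrinfo(
        "127.0.0.1", 9, 0, socket.SOCK_STREAM):
    # For a plain IPv4 literal this yields a single AF_INET entry,
    # so only one connect attempt is made before the error is raised.
    print(af, socktype, sa)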
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

req = <urllib.request.Request object at 0x7f9f021a8730>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e510>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByPOSTinN3_Conneg>

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:823:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

req = <urllib.request.Request object at 0x7f9f021a9230>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01941010>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByPOSTinRDFXML_Conneg>

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:755:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
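The tunnelling branch of do_open() quoted in the first traceback strips Proxy-Authorization from the origin-server headers and sends it only on the CONNECT request. The equivalent done by hand with http.client (hosts and credentials here are hypothetical):

import http.client

conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=5)
# The header rides on the CONNECT tunnel request, not on requests
# forwarded to the origin server.
conn.set_tunnel("origin.example", 443,
                headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
# conn.request("GET", "/")  # would first CONNECT through the proxy
conn.close()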
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

req = <urllib.request.Request object at 0x7f9f021a9650>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01940750>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByPOSTinTURTLE_Conneg>

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:789:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

req = <urllib.request.Request object at 0x7f9f021aa570>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d550>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '524', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testConstructByPOSTinUnknow>

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_fuseki2__v3_6_0__agrovoc.py:951:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
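All of these failures are environmental: the tests unconditionally query a live SPARQL endpoint, and pbuilder disables network access during the build. One common guard (an assumption on my part; the shipped suite does not do this) is to probe the endpoint once and skip the module when it is unreachable:

import socket

import pytest

def _endpoint_up(host, port, timeout=1.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Module-level mark: pytest skips every test in the file when the
# SPARQL endpoint (here the 127.0.0.1:9 placeholder) is down.
pytestmark = pytest.mark.skipif(
    not _endpoint_up("127.0.0.1", 9),
    reason="SPARQL endpoint unreachable; network disabled during build",
)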
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________
def testConstructByPOSTinUnknow_Conneg(self): > result = self.__generic(constructQuery, "bar", POST, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:959: E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________
def testConstructByPOSTinXML(self): > result = self.__generic(constructQuery, XML, POST) test/test_fuseki2__v3_6_0__agrovoc.py:714: E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________
def testConstructByPOSTinXML_Conneg(self): > result = self.__generic(constructQuery, XML, POST, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:721: E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________
def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:1143: E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________
def testDescribeByGETinJSONLD(self): > result = self.__generic(describeQuery, JSONLD, GET) test/test_fuseki2__v3_6_0__agrovoc.py:1103: E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________
def testDescribeByGETinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:1110: E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________
def testDescribeByGETinJSON_Unexpected(self): > result = self.__generic(describeQuery, JSON, GET) test/test_fuseki2__v3_6_0__agrovoc.py:1170: E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________
def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:1179: E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:1079
[traceback otherwise identical to testDescribeByGETinJSON_Unexpected_Conneg above;
 request Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01f08890> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01940bb0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML_Conneg> def testDescribeByGETinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:1011: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01f08890> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01940bb0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
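The collapsed failures above and below differ only in the Accept header that SPARQLWrapper derives from the return format requested for the DESCRIBE query. The pairs observed in this run, gathered here for reference (values copied from the request headers in the tracebacks, not from the SPARQLWrapper sources):

    # Requested return format -> Accept header observed in this log.
    OBSERVED_ACCEPT = {
        "JSON/CSV (unexpected for DESCRIBE)": "*/*",
        "N3": "application/turtle,text/turtle,text/rdf+n3,"
              "application/n-triples,application/n3,text/n3",
        "TURTLE": "application/turtle,text/turtle",
        "XML/RDFXML/unknown": "application/rdf+xml",
    }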
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:1045
[traceback otherwise identical; request Accept: application/turtle,text/turtle]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________
>       result = self.__generic(describeQuery, "foo", GET)
test/test_fuseki2__v3_6_0__agrovoc.py:1204
[traceback otherwise identical; request Accept: application/rdf+xml]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:1212
[traceback otherwise identical; request Accept: application/rdf+xml]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________
>       result = self.__generic(describeQuery, XML, GET)
test/test_fuseki2__v3_6_0__agrovoc.py:973
[traceback otherwise identical; request Accept: application/rdf+xml]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:980
[traceback otherwise identical; request Accept: application/rdf+xml]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01f09440> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943930> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943930>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1162:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
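The create_connection() listing above shows the pattern the standard library uses: every address returned by getaddrinfo() is tried in turn, and if none accepts the connection the last error is re-raised. A stripped-down sketch of that pattern (simplified: the real function also handles source_address and the all_errors ExceptionGroup mode):

import socket

def connect_first_working(host, port, timeout=None):
    # Try each resolved address in order; keep the last failure around
    # and re-raise it only if no address accepts the connection.
    last_err = None
    for af, socktype, proto, _canon, sa in socket.getaddrinfo(
            host, port, 0, socket.SOCK_STREAM):
        sock = None
        try:
            sock = socket.socket(af, socktype, proto)
            if timeout is not None:
                sock.settimeout(timeout)
            sock.connect(sa)
            return sock
        except OSError as err:
            last_err = err
            if sock is not None:
                sock.close()
    raise last_err if last_err is not None else OSError("getaddrinfo returned no results")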
________________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD _________________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5d550>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '501', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:1117:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943770>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1124:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
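As the do_open() listing shows, any OSError raised while sending the request is converted to urllib.error.URLError, with the original exception kept as .reason; that is why each test reports a ConnectionRefusedError followed by a URLError. A small illustration of the wrapping:

import urllib.error

try:
    raise ConnectionRefusedError(111, "Connection refused")
except OSError as err:
    wrapped = urllib.error.URLError(err)

print(wrapped)         # <urlopen error [Errno 111] Connection refused>
print(wrapped.reason)  # [Errno 111] Connection refused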
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e5f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '326', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:1187:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f019420b0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1196:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
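Because the endpoint is https and the request goes through an http proxy, do_open() calls set_tunnel() so that http.client would first connect to the proxy and then issue a CONNECT for the real host; with the proxy at 127.0.0.1:9 the initial TCP connect already fails, which is why the log never gets past this step. A sketch of that tunnelling setup, assuming the same unreachable proxy:

import http.client

conn = http.client.HTTPSConnection("127.0.0.1", 9, timeout=5)  # the proxy address from the log
conn.set_tunnel("agrovoc.uniroma2.it", 443)  # the real host, to be reached via CONNECT
try:
    conn.request("GET", "/")
except OSError as err:
    print(err)  # [Errno 111] Connection refused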
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5d550>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1096:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
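The Accept headers differ between failures because SPARQLWrapper derives them from the requested return format (the log shows, for example, JSONLD mapped to 'application/ld+json,application/x-json+ld' and N3 to a list of turtle/n3 media types). A sketch of what the tests' __generic() helper plausibly sets up, with the endpoint URL and DESCRIBE subject assumed for illustration:

from urllib.error import URLError
from SPARQLWrapper import JSONLD, POST, SPARQLWrapper

sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")  # URL assumed; only the host is in the log
sparql.setQuery("DESCRIBE <http://aims.fao.org/aos/agrovoc/c_3323>")  # hypothetical subject
sparql.setMethod(POST)
sparql.setReturnFormat(JSONLD)  # -> Accept: application/ld+json,application/x-json+ld
sparql.setOnlyConneg(True)      # what the *_Conneg test variants toggle
try:
    result = sparql.query()
except URLError as err:
    print(err)  # without network access this fails exactly like the tests above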
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7da90>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1028:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d0f0>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E           ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1062:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
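All of these tests need a live SPARQL endpoint, so in an offline build environment like this one they can only fail. A hypothetical guard (not part of the package) that would let pytest skip them cleanly instead:

import socket

import pytest

def _network_available(host="agrovoc.uniroma2.it", port=443, timeout=3):
    # Probe the endpoint once; a refused or timed-out connect means offline.
    try:
        socket.create_connection((host, port), timeout=timeout).close()
        return True
    except OSError:
        return False

requires_network = pytest.mark.skipif(
    not _network_available(),
    reason="SPARQL endpoint unreachable (build has no network access)")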
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testDescribeByPOSTinUnknow> def testDescribeByPOSTinUnknow(self): > result = self.__generic(describeQuery, "bar", POST) test/test_fuseki2__v3_6_0__agrovoc.py:1220: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01f0ae60> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e5f0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
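Every failure in this block bottoms out in the socket.create_connection call whose docstring and body are quoted above. A minimal sketch of the same call path, assuming the placeholder address the suite ends up dialing (127.0.0.1, port 9):

    import socket

    try:
        # timeout is applied via settimeout() before connect(), exactly as in
        # the stdlib body quoted in the traceback above
        sock = socket.create_connection(("127.0.0.1", 9), timeout=5.0)
    except ConnectionRefusedError as exc:
        # errno 111: nothing is listening on the target port
        print("connect failed:", exc)
    else:
        with sock:
            sock.sendall(b"ping")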
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

req = <urllib.request.Request object at 0x7f9f01f081b0>, host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    (do_open and create_connection listings identical to the first failure above)

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1228:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
    (remaining frames identical to the first failure above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

req = <urllib.request.Request object at 0x7f9f01f0a360>, host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    (do_open and create_connection listings identical to the first failure above)

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:987:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
    (remaining frames identical to the first failure above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError

______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

req = <urllib.request.Request object at 0x7f9f01f0bc20>, host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    (do_open and create_connection listings identical to the first failure above)

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:994:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
    (remaining frames identical to the first failure above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

req = <urllib.request.Request object at 0x7f9f0164c100>, host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    (do_open and create_connection listings identical to the first failure above)

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_fuseki2__v3_6_0__agrovoc.py:1253:
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    (remaining frames identical to the first failure above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    (do_open and create_connection listings identical to the first failure above)

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1238:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
    (remaining frames identical to the first failure above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
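testQueryBadFormed only passes when the endpoint actually answers the malformed query so SPARQLWrapper can raise QueryBadFormed; with no listener, assertRaises sees a URLError instead. The shape of that assertion as a self-contained sketch, where the raising helper is a hypothetical stand-in for the suite's __generic:

    import unittest
    from SPARQLWrapper.SPARQLExceptions import QueryBadFormed

    class BadQueryCheck(unittest.TestCase):
        def test_bad_query_raises(self):
            def run_bad_query():
                # stand-in for self.__generic(queryBadFormed, XML, GET);
                # a live endpoint would reject the query and SPARQLWrapper
                # would raise QueryBadFormed from that response
                raise QueryBadFormed("simulated parse error")
            self.assertRaises(QueryBadFormed, run_bad_query)

    if __name__ == "__main__":
        unittest.main()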
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

req = <urllib.request.Request object at 0x7f9f0164c940>, host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    (do_open and create_connection listings identical to the first failure above)

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1244:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
    (remaining frames identical to the first failure above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError

___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

req = <urllib.request.Request object at 0x7f9f0164c730>, host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    (do_open and create_connection listings identical to the first failure above)

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1241:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
    (remaining frames identical to the first failure above)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0164cec0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02120910>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only) request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.  When a connection cannot be created, raises the last
        error if *all_errors* is False, and an ExceptionGroup of all
        errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testQueryWithComma_1>

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1257:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    (do_open frame repeated as above, ending in)
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
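The chained frames above trace the whole failing path: the test calls SPARQLWrapper's query(), which hands a urllib request to urlopen, and the refused TCP connection to 127.0.0.1:9 surfaces as URLError. A minimal sketch of the same path through SPARQLWrapper's public API; the endpoint URL and query text here are illustrative placeholders, not taken from the test suite:

    from SPARQLWrapper import SPARQLWrapper, XML

    # Illustrative endpoint: the tests target agrovoc.uniroma2.it, but in
    # this build every outbound connection is redirected to 127.0.0.1:9,
    # where nothing listens, so query() raises URLError.
    sparql = SPARQLWrapper("https://agrovoc.uniroma2.it/sparql")
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(XML)
    result = sparql.query()  # Wrapper.py:960 -> _query() -> urlopen(request)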
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testQueryWithComma_3>

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1266:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    (do_open and create_connection frames identical to testQueryWithComma_1
     above; Accept: application/sparql-results+xml)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
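As the create_connection docstring above notes, every address returned by getaddrinfo is tried in turn and the last error is re-raised once all attempts fail. The refusal is reproducible in isolation; a small sketch, assuming port 9 has no listener (as on this build host):

    import socket

    try:
        # The call the stdlib traceback ends in; nothing listens on port 9.
        socket.create_connection(("127.0.0.1", 9), timeout=1)
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused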
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinCSV>

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:246:
    (same call chain and frames as above; Accept: text/csv)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinCSV_Conneg>

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:253:
    (same call chain and frames as above; Accept: text/csv)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinJSON>

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:302:
    (same call chain and frames as above; Accept:
     application/sparql-results+json,application/json,text/javascript,application/javascript)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
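The tunnel branch of do_open, shown in full in the first traceback, moves Proxy-Authorization off the origin headers and onto the CONNECT request, so the credential is seen only by the proxy. A sketch of the underlying http.client call; the proxy host and credential are made-up values for illustration:

    import http.client

    conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=1)
    # Headers passed to set_tunnel travel on the CONNECT request to the
    # proxy only; do_open deletes Proxy-Authorization from the headers
    # that go to the origin server.
    conn.set_tunnel("agrovoc.uniroma2.it", 443,
                    headers={"Proxy-Authorization": "Basic <credential>"})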
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected_Conneg>

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:386:
    (same call chain and frames as above; Accept: */*)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinJSON_Conneg>

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:309:
    (same call chain and frames as above; Accept:
     application/sparql-results+json,application/json,text/javascript,application/javascript)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
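Each of these failures ends at urllib/request.py:1322, where do_open catches the OSError from the refused connection and re-raises it as urllib.error.URLError. A caller that wants to treat an unreachable endpoint distinctly can catch that wrapper; a sketch against the same closed address:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)
    except urllib.error.URLError as err:
        # err.reason is the underlying ConnectionRefusedError ([Errno 111])
        print(err.reason)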
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinN3_Unexpected_Conneg>

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:344:
    (same call chain and frames as above; Accept: */*)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinTSV_Conneg>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:281:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[SPARQLWrapper -> urllib.request -> http.client -> socket frames identical to the first traceback above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
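Aside: the SPARQLWrapper frames (Wrapper.py:960 query -> Wrapper.py:926 _query) recur in every failure above. A sketch of the call path the tests drive, assuming SPARQLWrapper's documented API; the endpoint URL is a placeholder, not the one the suite uses:

    from SPARQLWrapper import GET, TSV, SPARQLWrapper

    # Placeholder endpoint; the test suite points at an unreachable
    # address, which is why every query() call here raises URLError.
    sparql = SPARQLWrapper("https://example.org/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(TSV)
    result = sparql.query()      # returns a QueryResult on success
    print(result.response.read())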
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinUnknow>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_fuseki2__v3_6_0__agrovoc.py:415:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[SPARQLWrapper -> urllib.request -> http.client -> socket frames identical to the first traceback above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
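Aside: every test above dies with the same chained URLError. A sketch of defensive handling around query(), so a caller can tell an unreachable endpoint apart from a bad response; the endpoint is again a placeholder:

    import urllib.error
    from SPARQLWrapper import XML, SPARQLWrapper

    sparql = SPARQLWrapper("https://example.org/sparql")  # placeholder
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setReturnFormat(XML)
    try:
        result = sparql.query()
    except urllib.error.URLError as err:
        # err.reason carries the underlying OSError, here a
        # ConnectionRefusedError from socket.create_connection.
        print("endpoint unreachable:", err.reason)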
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinUnknow_Conneg>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:424:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[SPARQLWrapper -> urllib.request -> http.client -> socket frames identical to the first traceback above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
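Aside: these failures are expected in a build environment without network access. One common pattern (an assumption about how such a suite could be hardened, not what this package does) is to probe the endpoint once and skip:

    import socket
    import unittest

    def endpoint_reachable(host, port, timeout=1.0):
        """Best-effort TCP probe; False when the build env is offline."""
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("agrovoc.uniroma2.it", 443),
                         "SPARQL endpoint unreachable (offline build?)")
    class OnlineTests(unittest.TestCase):
        def test_select(self):
            ...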
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinXML>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:214:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[SPARQLWrapper -> urllib.request -> http.client -> socket frames identical to the first traceback above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinXML_Conneg>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:222:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[SPARQLWrapper -> urllib.request -> http.client -> socket frames identical to the first traceback above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
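Aside: the *_Conneg variants differ only in the Accept header they send; the values visible in this log map the requested return format onto a MIME type. A sketch with urllib, reusing the header values from the request locals above; the URL is a placeholder:

    import urllib.request

    # Accept values taken from the request headers in this log:
    ACCEPT = {
        "xml": "application/sparql-results+xml",
        "csv": "text/csv",
        "tsv": "text/tab-separated-values",
    }

    req = urllib.request.Request(
        "https://example.org/sparql?query=...",   # placeholder URL
        headers={"Accept": ACCEPT["tsv"]},
    )
    # An unknown format ("foo" in testSelectByGETinUnknow) falls back
    # to application/sparql-results+xml, as its request headers show.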
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByPOSTinCSV>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '466', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:260:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[SPARQLWrapper -> urllib.request -> http.client -> socket frames identical to the first traceback above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
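Aside: the POST tests above carry Content-Type: application/x-www-form-urlencoded and a Content-Length (466 here). A sketch of form-encoding a SPARQL query for POST with the stdlib; the endpoint is a placeholder:

    import urllib.parse
    import urllib.request

    query = "SELECT ?s WHERE { ?s ?p ?o } LIMIT 5"
    body = urllib.parse.urlencode({"query": query}).encode("ascii")

    req = urllib.request.Request(
        "https://example.org/sparql",     # placeholder endpoint
        data=body,                        # data= makes this a POST
        headers={"Content-Type": "application/x-www-form-urlencoded",
                 "Accept": "text/csv"},
    )
    # http.client adds Content-Length automatically from len(body).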
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByPOSTinCSV_Conneg>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:267:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[SPARQLWrapper -> urllib.request -> http.client -> socket frames identical to the first traceback above]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByPOSTinCSV_Conneg> def testSelectByPOSTinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:267: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0164f3e0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c210> headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

[traceback identical to testSelectByPOSTinCSV_Conneg above; request differs only in
Accept: 'application/sparql-results+json,application/json,text/javascript,application/javascript'
and Content-Length: '393'; raised from test/test_fuseki2__v3_6_0__agrovoc.py:316
via self.__generic(selectQuery, JSON, POST)]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

[identical traceback; Accept: '*/*', Content-Length: '356'; raised from
test/test_fuseki2__v3_6_0__agrovoc.py:407 via self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

[identical traceback; Accept: 'application/sparql-results+json,application/json,text/javascript,application/javascript',
Content-Length: '356'; raised from test/test_fuseki2__v3_6_0__agrovoc.py:323
via self.__generic(selectQuery, JSON, POST, onlyConneg=True)]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

[identical traceback; Accept: '*/*', Content-Length: '356'; raised from
test/test_fuseki2__v3_6_0__agrovoc.py:365 via self.__generic(selectQuery, N3, POST, onlyConneg=True)]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________

[identical traceback; Accept: 'text/tab-separated-values', Content-Length: '566';
raised from test/test_fuseki2__v3_6_0__agrovoc.py:288 via self.__generic(selectQueryCSV_TSV, TSV, POST)]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01705180> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e430> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByPOSTinTSV_Conneg> def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:295: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01705180> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e430> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01704f70> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943bd0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByPOSTinUnknow> def testSelectByPOSTinUnknow(self): > result = self.__generic(selectQuery, "bar", POST) test/test_fuseki2__v3_6_0__agrovoc.py:433: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f01704f70> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943bd0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________
host = '127.0.0.1:9', http_class = <class 'http.client.HTTPSConnection'>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:442: in testSelectByPOSTinUnknow_Conneg
    result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
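Every failure in this run follows the template just shown: the test endpoint resolves to 127.0.0.1:9 (the TCP discard port, where nothing listens inside the network-less build chroot), the kernel refuses the connection, and urllib re-raises the socket-level OSError wrapped in a URLError at urllib/request.py:1322. A minimal, self-contained sketch of that wrapping behaviour; only the host and port are taken from the log, everything else is illustrative:

import urllib.error
import urllib.request

try:
    # Same target as the failing tests: TCP port 9 on localhost.
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
except urllib.error.URLError as err:
    # do_open() catches the OSError and re-raises it wrapped in URLError;
    # the original socket exception is preserved as err.reason.
    assert isinstance(err.reason, ConnectionRefusedError)
    print("wrapped:", err.reason)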
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________
host = '127.0.0.1:9', http_class = <class 'http.client.HTTPSConnection'>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:230: in testSelectByPOSTinXML
    result = self.__generic(selectQuery, XML, POST)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________
host = '127.0.0.1:9', http_class = <class 'http.client.HTTPSConnection'>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_6_0__agrovoc.py:238: in testSelectByPOSTinXML_Conneg
    result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
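From here on the failures come from test_fuseki2__v3_8_0__stw rather than test_fuseki2__v3_6_0__agrovoc, and they go through plain http.client.HTTPConnection instead of HTTPSConnection (note the Host: zbw.eu header against host = '127.0.0.1:9', which suggests the real endpoint was remapped to a dead local port for the offline build). The outcome is identical because the failure happens below HTTP, at the TCP connect. A socket-level sketch of the same refusal, using the stdlib helper seen in the tracebacks; host and port are from the log:

import socket

try:
    # create_connection() is the same helper shown in the tracebacks above.
    sock = socket.create_connection(("127.0.0.1", 9), timeout=5)
    sock.close()
except ConnectionRefusedError as err:
    print(err)  # [Errno 111] Connection refused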
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________
host = '127.0.0.1:9', http_class = <class 'http.client.HTTPConnection'>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_8_0__stw.py:493: in testAskByGETinCSV
    result = self.__generic(askQuery, CSV, GET)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________
host = '127.0.0.1:9', http_class = <class 'http.client.HTTPConnection'>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_8_0__stw.py:500: in testAskByGETinCSV_Conneg
    result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________
host = '127.0.0.1:9', http_class = <class 'http.client.HTTPConnection'>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_8_0__stw.py:549: in testAskByGETinJSON
    result = self.__generic(askQuery, JSON, GET)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________
host = '127.0.0.1:9', http_class = <class 'http.client.HTTPConnection'>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_8_0__stw.py:633: in testAskByGETinJSONLD_Unexpected_Conneg
    result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________
host = '127.0.0.1:9', http_class = <class 'http.client.HTTPConnection'>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_8_0__stw.py:556: in testAskByGETinJSON_Conneg
    result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected_Conneg> def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:591: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01706620>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941010> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01706360>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940210> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByGETinTSV> def testAskByGETinTSV(self): > result = self.__generic(askQuery, TSV, GET) test/test_fuseki2__v3_8_0__stw.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01706360>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940210> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

req = <urllib.request.Request object at 0x7f9f017047e0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942190>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByGETinTSV_Conneg>

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:528:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
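The *_Conneg variants exercise pure content negotiation. Assuming __generic's onlyConneg flag maps onto SPARQLWrapper's setOnlyConneg() (an assumption; the test module itself is not shown in this log), the difference is that no format=/output= URL parameters are added, so the Accept header alone selects the serialization:

    from SPARQLWrapper import GET, TSV, SPARQLWrapper

    sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")   # hypothetical path
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")
    sparql.setMethod(GET)
    sparql.setReturnFormat(TSV)
    sparql.setOnlyConneg(True)   # rely on Accept only, no extra URL parameters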
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01707070>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940e50> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByGETinUnknow> def testAskByGETinUnknow(self): > result = self.__generic(askQuery, "foo", GET) test/test_fuseki2__v3_8_0__stw.py:662: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01707070>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940e50> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

req = <urllib.request.Request object at 0x7f9f01707280>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941a90>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByGETinUnknow_Conneg>

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:671:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
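The header handling repeated in every do_open frame can be watched in isolation: unredirected headers win the merge, and all names are Title-Cased before the request is written. A standalone sketch (the host name is reused from the log; no traffic is sent):

    import urllib.request

    req = urllib.request.Request("http://zbw.eu/", headers={"accept": "text/csv"})
    req.add_unredirected_header("user-agent", "sparqlwrapper 2.0.0")

    headers = dict(req.unredirected_hdrs)              # unredirected first...
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})              # ...the rest fill gaps
    headers["Connection"] = "close"                    # force one-shot connection
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'User-Agent': 'sparqlwrapper 2.0.0', 'Accept': 'text/csv', 'Connection': 'close'}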
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01707540>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e190> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByGETinXML> def testAskByGETinXML(self): > result = self.__generic(askQuery, XML, GET) test/test_fuseki2__v3_8_0__stw.py:461: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01707540>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e190> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

req = <urllib.request.Request object at 0x7f9f01707800>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d9b0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg>

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:469:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinCSV _____________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01707a10>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940bb0> headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinCSV> def testAskByPOSTinCSV(self): > result = self.__generic(askQuery, CSV, POST) test/test_fuseki2__v3_8_0__stw.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01707a10>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940bb0> headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________________ SPARQLWrapperTests.testAskByPOSTinCSV_Conneg _________________

req = <urllib.request.Request object at 0x7f9f01707ee0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d550>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinCSV_Conneg>

    def testAskByPOSTinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:514:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01707c20>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f019435b0> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '333', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinJSON>
req = <urllib.request.Request object at 0x7f9f01707c20>, host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '333', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:563
[... urllib/http.client/socket traceback identical to testAskByPOSTinCSV_Conneg above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected_Conneg>
req = <urllib.request.Request object at 0x7f9f0172c1b0>, host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:654
[... urllib/http.client/socket traceback identical to testAskByPOSTinCSV_Conneg above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
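The middle frames of each traceback (urlopen → OpenerDirector.open → _open → _call_chain → HTTPHandler.http_open → do_open) are urllib's protocol dispatch. A sketch of that dispatch using only the public urllib.request API:

    import urllib.request

    # build_opener() assembles the same default handler chain that urlopen()
    # uses; for an http:// URL, _call_chain ends up calling
    # HTTPHandler.http_open, which delegates to do_open(HTTPConnection, req).
    opener = urllib.request.build_opener()
    print([type(h).__name__ for h in opener.handlers])
    # e.g. ['ProxyHandler', 'UnknownHandler', 'HTTPHandler', ...]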
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172c7e0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e7b0> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinJSON_Conneg> def testAskByPOSTinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, POST, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:570: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172c7e0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e7b0> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinN3_Unexpected_Conneg>
req = <urllib.request.Request object at 0x7f9f0172ce10>, host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:612
[... urllib/http.client/socket traceback identical to testAskByPOSTinCSV_Conneg above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
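The comment block inside do_open explains why every request above carries 'Connection: close': the addinfourl response object cannot manage a persistent connection. The header is injected by do_open at send time, not stored on the Request; a sketch (URL and body are illustrative):

    import urllib.request

    req = urllib.request.Request("http://127.0.0.1:9/ds/query",
                                 data=b"query=ASK%20%7B%20%3Fs%20%3Fp%20%3Fo%20%7D")
    print(req.get_method())              # 'POST': a Request with a body defaults to POST
    print(req.has_header("Connection"))  # False: do_open adds the header at send time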
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinTSV _____________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172c310>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d710> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '430', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinTSV> def testAskByPOSTinTSV(self): > result = self.__generic(askQuery, TSV, POST) test/test_fuseki2__v3_8_0__stw.py:535: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172c310>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d710> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '430', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________________ SPARQLWrapperTests.testAskByPOSTinTSV_Conneg _________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinTSV_Conneg>
req = <urllib.request.Request object at 0x7f9f0172c9f0>, host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:542
[... urllib/http.client/socket traceback identical to testAskByPOSTinCSV_Conneg above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172d860>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5ef90> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow> def testAskByPOSTinUnknow(self): > result = self.__generic(askQuery, "bar", POST) test/test_fuseki2__v3_8_0__stw.py:680: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172d860>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5ef90> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow_Conneg>
req = <urllib.request.Request object at 0x7f9f0172da70>, host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:689
[... urllib/http.client/socket traceback identical to testAskByPOSTinCSV_Conneg above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172dd30>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5f770> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinXML> def testAskByPOSTinXML(self): > result = self.__generic(askQuery, XML, POST) test/test_fuseki2__v3_8_0__stw.py:477: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172dd30>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5f770> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}
test/test_fuseki2__v3_8_0__stw.py:485: in testAskByPOSTinXML_Conneg
    result = self.__generic(askQuery, XML, POST, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
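The ConnectionRefusedError at socket.py:849 is raised before any HTTP is spoken: create_connection() loops over getaddrinfo() results and connect() fails for each candidate address. A two-line probe shows the same error this log keeps hitting (assuming, as here, that loopback port 9 has no listener):

import socket

try:
    # Same call the traceback shows: connect to ('127.0.0.1', 9)
    socket.create_connection(("127.0.0.1", 9), timeout=1)
except ConnectionRefusedError as err:
    print(err)  # [Errno 111] Connection refused; a subclass of OSError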
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172d230>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5e510> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected_Conneg> def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:878: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172d230>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5e510> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________
host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
test/test_fuseki2__v3_8_0__stw.py:835: in testConstructByGETinJSONLD
    result = self.__generic(constructQuery, JSONLD, GET)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
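do_open() catches the OSError and re-raises it as URLError at request.py:1322, which is why every failure here ends with two chained exceptions ("During handling of the above exception, another exception occurred"). The original error remains available on the .reason attribute, as this sketch shows:

import urllib.error
import urllib.request

try:
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)
except urllib.error.URLError as err:
    # err.reason is the wrapped ConnectionRefusedError from socket.connect()
    print(type(err.reason).__name__, err.reason)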
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172e150>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5e6d0> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByGETinJSONLD_Conneg> def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:843: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172e150>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5e6d0> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
test/test_fuseki2__v3_8_0__stw.py:905: in testConstructByGETinJSON_Unexpected
    result = self.__generic(constructQuery, JSON, GET)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172ef10>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02122190> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByGETinJSON_Unexpected_Conneg> def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:914: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172ef10>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02122190> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________
host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
test/test_fuseki2__v3_8_0__stw.py:810: in testConstructByGETinN3_Conneg
    result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0172f120>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940130> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
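The failure above is reproducible outside the test suite. A minimal sketch, assuming only that nothing is listening on 127.0.0.1:9 (the address the endpoint resolves to here, since network access is disabled during the build):

import urllib.error
import urllib.request

try:
    # Same code path as SPARQLWrapper/Wrapper.py:926 above: urlopen drives
    # HTTPHandler.do_open, which fails at connect() time.
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
except urllib.error.URLError as err:
    # urllib wraps the socket-level OSError, as at urllib/request.py:1322.
    print(err.reason)  # [Errno 111] Connection refused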
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML_Conneg>

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:742:
[traceback identical to testConstructByGETinN3_Conneg above; request headers:
 {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE_Conneg>

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:776:
[traceback identical to testConstructByGETinN3_Conneg above; request headers:
 {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
  'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
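The create_connection frames show why the error surfaces as errno 111: connect() on a TCP socket to a port with no listener is refused, and create_connection re-raises the last error once every getaddrinfo result has been tried (or an ExceptionGroup when all_errors=True). A sketch at the socket layer, under the same assumption that 127.0.0.1:9 is closed:

import errno
import socket

try:
    # Same call that fails at /usr/lib/python3.13/socket.py:849 above.
    socket.create_connection(("127.0.0.1", 9), timeout=5)
except ConnectionRefusedError as err:
    # ECONNREFUSED is errno 111 on Linux, matching the traceback.
    assert err.errno == errno.ECONNREFUSED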
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByGETinUnknow>

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_fuseki2__v3_8_0__stw.py:939:
[traceback identical to testConstructByGETinN3_Conneg above; request headers:
 {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg>

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:947:
[traceback identical to testConstructByGETinN3_Conneg above; request headers:
 {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
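Every failing test funnels through the suite's __generic helper (test/test_fuseki2__v3_8_0__stw.py:194), which builds the query with SPARQLWrapper and calls query(). A hedged sketch of the equivalent standalone call, using only the public SPARQLWrapper 2.0.0 API; the endpoint URL is a placeholder, not the one the suite targets:

from SPARQLWrapper import GET, N3, SPARQLWrapper

sparql = SPARQLWrapper("http://example.org/sparql")  # placeholder endpoint
sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 10")
sparql.setMethod(GET)          # the *ByGET* tests; *ByPOST* variants use POST
sparql.setReturnFormat(N3)     # drives the Accept header seen in the locals
sparql.setOnlyConneg(True)     # the *_Conneg variants: content negotiation only
result = sparql.query()        # raises URLError when the endpoint is unreachable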
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByGETinXML>

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:704:
[traceback identical to testConstructByGETinN3_Conneg above; request headers:
 {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg>

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:711:
[traceback identical to testConstructByGETinN3_Conneg above; request headers:
 {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu',
  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
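The header dicts in the locals all look alike because do_open (quoted in full above) normalizes them before sending: unredirected headers take precedence, every name is title-cased, and Connection is forced to close. A sketch of just that dictionary logic, with example values standing in for the real request state:

# Mirrors the header handling in HTTPHandler.do_open quoted above;
# the input dicts are illustrative, not taken from the build.
unredirected = {"Host": "zbw.eu"}
regular = {"accept": "application/rdf+xml", "user-agent": "sparqlwrapper 2.0.0"}

headers = dict(unredirected)
headers.update({k: v for k, v in regular.items() if k not in headers})
headers["Connection"] = "close"  # one request per connection, per the comment above
headers = {name.title(): val for name, val in headers.items()}
print(headers)
# {'Host': 'zbw.eu', 'Accept': 'application/rdf+xml',
#  'User-Agent': 'sparqlwrapper 2.0.0', 'Connection': 'close'}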
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByPOSTinCSV_Unexpected_Conneg>

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:897:
[traceback identical to testConstructByGETinN3_Conneg above; request headers:
 {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484',
  'Content-Type': 'application/x-www-form-urlencoded', ...}]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByPOSTinJSONLD>

    def testConstructByPOSTinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, POST)

test/test_fuseki2__v3_8_0__stw.py:851:
[traceback identical to testConstructByGETinN3_Conneg above; request headers:
 {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
  'Content-Length': '696', 'Content-Type': 'application/x-www-form-urlencoded', ...}]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
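The ByPOST failures differ from the ByGET ones only in transport: the locals show the query travelling as an urlencoded form body ('Content-Type': 'application/x-www-form-urlencoded', plus a Content-Length), consistent with the SPARQL protocol's URL-encoded POST binding. A stdlib-only sketch of that request shape, with the endpoint again a placeholder:

import urllib.parse
import urllib.request

body = urllib.parse.urlencode({"query": "CONSTRUCT WHERE { ?s ?p ?o }"}).encode()
req = urllib.request.Request(
    "http://example.org/sparql",          # placeholder endpoint
    data=body,                            # a non-None body makes this a POST
    headers={
        "Accept": "application/ld+json",  # return-format negotiation, as above
        "Content-Type": "application/x-www-form-urlencoded",
    },
)
# urllib.request.urlopen(req) fails with the same URLError if nothing listens.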
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

req = <urllib.request.Request object at 0x7f9f0152c5d0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940ad0>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
test/test_fuseki2__v3_8_0__stw.py:859: in testConstructByPOSTinJSONLD_Conneg
    result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________

req = <urllib.request.Request object at 0x7f9f0152d180>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7edd0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...}
test/test_fuseki2__v3_8_0__stw.py:922: in testConstructByPOSTinJSON_Unexpected
    result = self.__generic(constructQuery, JSON, POST)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

req = <urllib.request.Request object at 0x7f9f0152d440>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7c2f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
test/test_fuseki2__v3_8_0__stw.py:931: in testConstructByPOSTinJSON_Unexpected_Conneg
    result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

req = <urllib.request.Request object at 0x7f9f0152c940>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7f5b0>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
test/test_fuseki2__v3_8_0__stw.py:827: in testConstructByPOSTinN3_Conneg
    result = self.__generic(constructQuery, N3, POST, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

req = <urllib.request.Request object at 0x7f9f0152cb50>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e430>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
test/test_fuseki2__v3_8_0__stw.py:759: in testConstructByPOSTinRDFXML_Conneg
    result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

req = <urllib.request.Request object at 0x7f9f0152db20>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7f690>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
test/test_fuseki2__v3_8_0__stw.py:793: in testConstructByPOSTinTURTLE_Conneg
    result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

req = <urllib.request.Request object at 0x7f9f0152c940>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7dc50>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}
test/test_fuseki2__v3_8_0__stw.py:955: in testConstructByPOSTinUnknow
    result = self.__generic(constructQuery, "bar", POST)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

req = <urllib.request.Request object at 0x7f9f0152e410>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5ec10>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByPOSTinUnknow_Conneg> def testConstructByPOSTinUnknow_Conneg(self): > result = self.__generic(constructQuery, "bar", POST, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:963: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0152e410>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5ec10> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0152e570>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7c4b0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
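Every failure in this run is the same transport error surfacing through urllib: the suite's endpoint host is '127.0.0.1:9' (the discard port), where evidently nothing is listening during the build, so http.client's connect() is refused and do_open() re-wraps the OSError as a URLError. A minimal sketch that reproduces the same exception chain, assuming port 9 is closed on the local machine:

    import urllib.request
    import urllib.error

    try:
        # connect() to a closed port raises ConnectionRefusedError inside
        # http.client; urllib's do_open() then wraps it in URLError.
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the original ConnectionRefusedError ([Errno 111])
        print(type(err.reason).__name__, err.reason)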
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

req = <urllib.request.Request object at 0x7f9f0152e570>, host = '127.0.0.1:9'
h = <http.client.HTTPConnection object at 0x7f9f01d7c4b0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByPOSTinXML>

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_fuseki2__v3_8_0__stw.py:718:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

req = <urllib.request.Request object at 0x7f9f0152efc0>, host = '127.0.0.1:9'
h = <http.client.HTTPConnection object at 0x7f9f01d7e970>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByPOSTinXML_Conneg>

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:725:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
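The create_connection() docstring quoted in these tracebacks distinguishes two failure modes: by default only the last per-address error is re-raised, while all_errors=True raises an ExceptionGroup covering every attempted address. A small illustration (Python 3.11+, again assuming nothing listens on 127.0.0.1:9):

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        # With all_errors=True the refused attempts arrive as an
        # ExceptionGroup, one entry per getaddrinfo() result that was tried.
        print(len(group.exceptions), "refused connection attempt(s)")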
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

req = <urllib.request.Request object at 0x7f9f0152e360>, host = '127.0.0.1:9'
h = <http.client.HTTPConnection object at 0x7f9f02b5f230>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinCSV_Unexpected_Conneg>

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1147:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

req = <urllib.request.Request object at 0x7f9f0152dc80>, host = '127.0.0.1:9'
h = <http.client.HTTPConnection object at 0x7f9f01d7c830>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinJSONLD>

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)

test/test_fuseki2__v3_8_0__stw.py:1107:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
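The application frames are identical in every failure: SPARQLWrapper's query() builds a urllib request in _query() and hands it to urlopen, so a transport error surfaces as a plain urllib.error.URLError rather than a SPARQLWrapper-specific exception. A sketch of the call pattern the tests exercise (the endpoint URL is a placeholder standing in for the unreachable test endpoint):

    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")  # placeholder endpoint
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setReturnFormat(JSON)
    result = sparql.query()  # raises urllib.error.URLError when the endpoint is down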
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

req = <urllib.request.Request object at 0x7f9f0152f6a0>, host = '127.0.0.1:9'
h = <http.client.HTTPConnection object at 0x7f9f02b5e6d0>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinJSONLD_Conneg>

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1114:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

req = <urllib.request.Request object at 0x7f9f0152f070>, host = '127.0.0.1:9'
h = <http.client.HTTPConnection object at 0x7f9f02b5e5f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected>

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)

test/test_fuseki2__v3_8_0__stw.py:1174:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
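Failures of this shape are expected wherever the suite runs without a live Fuseki instance behind the configured endpoint. One conventional guard (not something this test suite is shown to do; the helper below is made up for illustration) is to probe the endpoint once and skip the networked tests otherwise:

    import socket
    import unittest

    def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
        """Return True if a TCP connection to (host, port) succeeds in time."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("127.0.0.1", 9),
                         "SPARQL endpoint not reachable")
    class NetworkedSPARQLTests(unittest.TestCase):
        ...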
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

req = <urllib.request.Request object at 0x7f9f0152f330>, host = '127.0.0.1:9'
h = <http.client.HTTPConnection object at 0x7f9f02122eb0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected_Conneg>

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1183:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

req = <urllib.request.Request object at 0x7f9f00fb81b0>, host = '127.0.0.1:9'
h = <http.client.HTTPConnection object at 0x7f9f02b5e0b0>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinN3_Conneg>

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1083:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
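The socket.py:864 frame ("raise exceptions[0]") marks where create_connection() gives up: it records an error for each address returned by getaddrinfo() and, with all_errors left False, re-raises only the most recent one. A condensed re-statement of that loop (connect_first is a made-up name, not stdlib API):

    import socket

    def connect_first(host: str, port: int, timeout: float = 5.0) -> socket.socket:
        last_error = None
        for af, socktype, proto, _canon, sa in socket.getaddrinfo(
                host, port, 0, socket.SOCK_STREAM):
            sock = None
            try:
                sock = socket.socket(af, socktype, proto)
                sock.settimeout(timeout)
                sock.connect(sa)
                return sock  # first address that accepts the connection wins
            except OSError as exc:
                last_error = exc
                if sock is not None:
                    sock.close()
        if last_error is None:
            raise OSError(f"getaddrinfo returned no addresses for {host!r}")
        raise last_error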
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00fb8310>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021203d0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.  When a connection cannot be created, raises the last
        error if *all_errors* is False, and an ExceptionGroup of all
        errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML_Conneg> def testDescribeByGETinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:1015: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fb8310>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021203d0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________
host = '127.0.0.1:9', headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinTURTLE_Conneg>
    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:1049:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________
host = '127.0.0.1:9', headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow>
    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)
test/test_fuseki2__v3_8_0__stw.py:1208:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________
host = '127.0.0.1:9', headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow_Conneg>
    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:1216:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________
host = '127.0.0.1:9', headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinXML>
    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)
test/test_fuseki2__v3_8_0__stw.py:977:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________
host = '127.0.0.1:9', headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByGETinXML_Conneg>
    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:984:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________
host = '127.0.0.1:9', headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinCSV_Unexpected_Conneg>
    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:1166:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD _________________
host = '127.0.0.1:9', headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '495', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSONLD>
    def testDescribeByPOSTinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, POST)
test/test_fuseki2__v3_8_0__stw.py:1121:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req)
self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fb95a0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943bd0> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '495', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request.
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fb8520>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7ef90> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSONLD_Conneg> def testDescribeByPOSTinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:1128: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fb8520>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7ef90> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00fb97b0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f019411d0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '320', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() listing elided; identical to the first failure above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
    [http.client/socket frames and create_connection() listing elided;
     identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSON_Unexpected>

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:1191:
    [__generic()/SPARQLWrapper/urllib frames and repeated do_open() listing
     elided; identical to the first failure above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fba410>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940ad0> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSON_Unexpected_Conneg> def testDescribeByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:1200: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fba410>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940ad0> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00fb8890>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d1d0>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() listing elided; identical to the first failure above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
    [http.client/socket frames and create_connection() listing elided;
     identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinN3_Conneg>

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1100:
    [__generic()/SPARQLWrapper/urllib frames and repeated do_open() listing
     elided; identical to the first failure above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00fb9e90>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7c590>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() listing elided; identical to the first failure above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
    [http.client/socket frames and create_connection() listing elided;
     identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinRDFXML_Conneg>

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1032:
    [__generic()/SPARQLWrapper/urllib frames and repeated do_open() listing
     elided; identical to the first failure above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00fbadb0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7f5b0>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() listing elided; identical to the first failure above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
    [http.client/socket frames and create_connection() listing elided;
     identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinTURTLE_Conneg>

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1066:
    [__generic()/SPARQLWrapper/urllib frames and repeated do_open() listing
     elided; identical to the first failure above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
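Each test frame above funnels through the same SPARQLWrapper call chain (query() -> _query() -> urlopen()). A minimal sketch of what the suite's __generic() helper boils down to; the endpoint URL, dataset path, and query text are illustrative assumptions, not taken from the test source:

    from SPARQLWrapper import SPARQLWrapper, POST, TURTLE

    sparql = SPARQLWrapper("http://127.0.0.1:9/ds/query")  # hypothetical dataset path
    sparql.setQuery("DESCRIBE <http://example.org/resource>")
    sparql.setMethod(POST)          # the failures above are all POST variants
    sparql.setReturnFormat(TURTLE)  # drives the Accept header seen in `headers`
    result = sparql.query()         # raises URLError while nothing listens on port 9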
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fb8890>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943bd0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinUnknow> def testDescribeByPOSTinUnknow(self): > result = self.__generic(describeQuery, "bar", POST) test/test_fuseki2__v3_8_0__stw.py:1224: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fb8890>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943bd0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00fbb750>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7def0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() listing elided; identical to the first failure above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
    [http.client/socket frames and create_connection() listing elided;
     identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinUnknow_Conneg>

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1232:
    [__generic()/SPARQLWrapper/urllib frames and repeated do_open() listing
     elided; identical to the first failure above]
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fbb1d0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f019434d0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinXML> def testDescribeByPOSTinXML(self): > result = self.__generic(describeQuery, XML, POST) test/test_fuseki2__v3_8_0__stw.py:991: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fbb1d0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f019434d0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
req = <urllib.request.Request object at 0x7f9f00fbb330>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d8d0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open()/create_connection() listings identical to the failures above, ending in ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinXML_Conneg>

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:998:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[... do_open() listing repeated ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
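The create_connection() docstring quoted above notes that per-address errors are collected and either the last one is re-raised or, with all_errors=True, an ExceptionGroup of all of them. A sketch of the all_errors path, assuming Python 3.11+ (for except*) and that port 9 is closed on every address "localhost" resolves to:

    import socket

    # With all_errors=True every per-address failure is kept and raised
    # together as an ExceptionGroup instead of only the last one.
    try:
        socket.create_connection(("localhost", 9), timeout=1, all_errors=True)
    except* ConnectionRefusedError as group:
        for exc in group.exceptions:
            print(exc)   # one refusal per resolved address (IPv4 and/or IPv6)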
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________________ SPARQLWrapperTests.testKeepAlive _______________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fbbee0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942c10> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testKeepAlive> def testKeepAlive(self): sparql = SPARQLWrapper(endpoint) sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") sparql.setReturnFormat(JSON) sparql.setMethod(GET) sparql.setUseKeepAlive() > sparql.query() test/test_fuseki2__v3_8_0__stw.py:1257: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00fbbee0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942c10> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

[... do_open()/create_connection() listings identical to the failures above, ending in ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testQueryBadFormed>

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:1242:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[... do_open() listing repeated ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
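testQueryBadFormed fails differently in spirit from the others: it expects a QueryBadFormed from the server, but the connection is refused before any response arrives, so a URLError escapes the assertRaises instead. The test uses the callable form of assertRaises, which takes the function plus its arguments rather than a with-block; a small self-contained illustration of that convention:

    import unittest

    class CallingConvention(unittest.TestCase):
        def test_raises_with_args(self):
            # assertRaises(exc, callable, *args) invokes callable(*args) and
            # passes only if exc is raised -- the same form the suite uses in
            # self.assertRaises(QueryBadFormed, self.__generic, ...).
            self.assertRaises(ZeroDivisionError, divmod, 1, 0)

    if __name__ == "__main__":
        unittest.main()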
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
req = <urllib.request.Request object at 0x7f9f00f881b0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5dfd0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open()/create_connection() listings identical to the failures above, ending in ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testQueryDuplicatedPrefix>

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:1248:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[... do_open() listing repeated ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
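The tunnel branch of do_open(), repeated in every listing above, moves Proxy-Authorization onto the CONNECT request so it is never forwarded to the origin server. Roughly equivalent standalone code, with hypothetical host names standing in for whatever req._tunnel_host would carry:

    import http.client

    # Mirrors what do_open() arranges for an https-over-proxy request.
    conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=5)
    conn.set_tunnel("origin.example", 443,
                    headers={"Proxy-Authorization": "Basic <credentials>"})
    # conn.request(...) would now send "CONNECT origin.example:443" first,
    # carrying Proxy-Authorization only on that CONNECT, not to the origin.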
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f88e10>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942350> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testQueryManyPrefixes> def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) test/test_fuseki2__v3_8_0__stw.py:1245: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f88e10>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942350> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
req = <urllib.request.Request object at 0x7f9f00f89020>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5dd30>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open()/create_connection() listings identical to the failures above, ending in ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testQueryWithComma_1>

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:1261:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[... do_open() listing repeated ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
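The locals line "timeout = <object object at 0x7f9f04858cd0>" in the create_connection() frames above is the _GLOBAL_DEFAULT_TIMEOUT sentinel: no explicit timeout was passed down, so the global default applies. How that default is set and read, as a sketch:

    import socket

    socket.setdefaulttimeout(5)         # what getdefaulttimeout() will return
    print(socket.getdefaulttimeout())   # 5.0
    # urllib.request.urlopen(url) now inherits this 5 s default; passing an
    # explicit timeout (SPARQLWrapper exposes setTimeout()) overrides it.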
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f88aa0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5e6d0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testQueryWithComma_3> def testQueryWithComma_3(self): > result = self.__generic(queryWithCommaInUri, XML, GET) test/test_fuseki2__v3_8_0__stw.py:1270: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f88aa0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5e6d0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinCSV>

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_fuseki2__v3_8_0__stw.py:250:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
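For context, this is roughly what the failing test drives through SPARQLWrapper's public API; the endpoint URL and query below are placeholders, not the suite's selectQueryCSV_TSV fixture. setReturnFormat(CSV) is what yields the Accept: text/csv header recorded above:

    from SPARQLWrapper import CSV, GET, SPARQLWrapper

    sparql = SPARQLWrapper("http://example.org/sparql")  # hypothetical endpoint
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")  # stand-in query
    sparql.setMethod(GET)
    sparql.setReturnFormat(CSV)  # produces the Accept: text/csv header above

    result = sparql.query()  # raises urllib.error.URLError when unreachable
    print(result.response.read().decode("utf-8"))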
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f88730>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5ef90> headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinCSV_Conneg> def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) test/test_fuseki2__v3_8_0__stw.py:257: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f88730>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5ef90> headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinJSON>

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_fuseki2__v3_8_0__stw.py:306:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected_Conneg>

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:390:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinJSON_Conneg>

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:313:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinN3_Unexpected_Conneg>

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:348:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinTSV>

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_fuseki2__v3_8_0__stw.py:278:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f8a570>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d7f0> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
        del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple
        ``(host, port)``) and return the socket object.  Passing the
        optional *timeout* parameter will set the timeout on the socket
        instance before attempting to connect.  If no *timeout* is
        supplied, the global default timeout setting returned by
        :func:`getdefaulttimeout` is used.  If *source_address* is set
        it must be a tuple of (host, port) for the socket to bind as a
        source address before making the connection.  A host of '' or
        port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is
        False, and an ExceptionGroup of all errors if *all_errors* is
        True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByGETinTSV_Conneg>

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:285:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00f8a570>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d7f0>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open source context identical to the listing above]
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
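Note: every failure in this run has the same root cause. The requests go to host '127.0.0.1:9' (port 9 is the discard port, which refuses TCP connections) while the Host header still says zbw.eu, so connect() fails with ECONNREFUSED before any HTTP is exchanged; urllib then wraps the OSError in a URLError. A minimal sketch of that failure mode, assuming any unreachable endpoint:

    import urllib.request
    import urllib.error

    # Hypothetical URL standing in for the redirected test endpoint;
    # port 9 (discard) refuses connections on a typical machine.
    UNREACHABLE = "http://127.0.0.1:9/sparql"

    try:
        urllib.request.urlopen(UNREACHABLE, timeout=5)
    except urllib.error.URLError as err:
        # do_open() catches the OSError and re-raises it as URLError,
        # keeping the original exception in .reason.
        print(type(err.reason).__name__, err.reason)
        # -> ConnectionRefusedError [Errno 111] Connection refused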
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

req = <urllib.request.Request object at 0x7f9f00f8a830>
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7dc50>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_fuseki2__v3_8_0__stw.py:419:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib/http.client/socket frames and do_open/create_connection source context
identical to testSelectByGETinTSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

req = <urllib.request.Request object at 0x7f9f00f8aba0>
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7c590>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:428:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib/http.client/socket frames and do_open/create_connection source context
identical to testSelectByGETinTSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

req = <urllib.request.Request object at 0x7f9f00f8ae60>
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e270>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:218:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib/http.client/socket frames and do_open/create_connection source context
identical to testSelectByGETinTSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

req = <urllib.request.Request object at 0x7f9f00f8afc0>
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e6d0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:226:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib/http.client/socket frames and do_open/create_connection source context
identical to testSelectByGETinTSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
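Note: each failure is printed twice because of exception chaining: sock.connect() raises ConnectionRefusedError, do_open() catches it as OSError and raises URLError(err), and pytest shows both halves joined by the 'During handling of the above exception' divider. Callers that need the low-level errno can unwrap .reason, as in this sketch:

    import errno
    import urllib.error
    import urllib.request

    def is_connection_refused(url):
        """Return True if opening *url* fails specifically with ECONNREFUSED."""
        try:
            urllib.request.urlopen(url, timeout=5)
        except urllib.error.URLError as err:
            reason = err.reason  # the original OSError kept by do_open()
            return isinstance(reason, OSError) and reason.errno == errno.ECONNREFUSED
        return False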
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

req = <urllib.request.Request object at 0x7f9f00f8b280>
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943850>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)

test/test_fuseki2__v3_8_0__stw.py:264:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib/http.client/socket frames and do_open/create_connection source context
identical to testSelectByGETinTSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

req = <urllib.request.Request object at 0x7f9f00f8b490>
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942eb0>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:271:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib/http.client/socket frames and do_open/create_connection source context
identical to testSelectByGETinTSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

req = <urllib.request.Request object at 0x7f9f00f88f70>
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7de10>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '387', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:320:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib/http.client/socket frames and do_open/create_connection source context
identical to testSelectByGETinTSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused (/usr/lib/python3.13/socket.py:849)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused> (/usr/lib/python3.13/urllib/request.py:1322)
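Note: every one of these tests needs a live SPARQL endpoint, so they fail wholesale in a network-isolated build. One way to keep such a suite green offline is to probe the endpoint once and skip; the helper and constants below are hypothetical, not part of the test suite:

    import socket

    import pytest

    ENDPOINT_HOST, ENDPOINT_PORT = "127.0.0.1", 9  # hypothetical probe target

    def endpoint_reachable(host, port, timeout=2.0):
        """Best-effort TCP probe; True only if a connect() succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Applied to a test class or module, this turns 'Connection refused'
    # failures into skips when the endpoint is down.
    requires_endpoint = pytest.mark.skipif(
        not endpoint_reachable(ENDPOINT_HOST, ENDPOINT_PORT),
        reason="SPARQL endpoint not reachable from the build environment",
    )

    @requires_endpoint
    def test_select_by_get():
        ...  # would issue the real query here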
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                    encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected_Conneg>

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:411: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00f8aba0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943cb0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            try:
                try:
                    h.request(req.get_method(), req.selector, req.data, headers,
                        encode_chunked=req.has_header('Transfer-encoding'))
                except OSError as err: # timeout error
>                   raise URLError(err)
E                   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinJSON_Conneg>

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:327: 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '350',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
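Every one of these failures bottoms out in the same call: socket.create_connection() iterates the getaddrinfo() results for 127.0.0.1:9 (the discard port, where nothing is listening in the network-isolated build environment), collects the per-address errors, and re-raises the last one. The underlying error is reproducible in isolation; a minimal sketch, assuming no listener on port 9:

    import socket

    # With no listener on the port, Linux refuses the TCP handshake
    # immediately (ECONNREFUSED) rather than timing out.
    try:
        with socket.create_connection(("127.0.0.1", 9), timeout=5):
            pass
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused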
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinN3_Unexpected_Conneg>

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:369: 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinTSV>

    def testSelectByPOSTinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST)

test/test_fuseki2__v3_8_0__stw.py:292: 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Content-Length': '537', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinTSV_Conneg>

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:299: 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinUnknow>

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)

test/test_fuseki2__v3_8_0__stw.py:437: 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinUnknow_Conneg>

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:446: 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinXML>

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)

test/test_fuseki2__v3_8_0__stw.py:234: 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testSelectByPOSTinXML_Conneg>

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:242: 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError

__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... identical ConnectionRefusedError -> URLError traceback ...]

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:663:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError

_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'factforge.net',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... identical ConnectionRefusedError -> URLError traceback ...]

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:585:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... identical ConnectionRefusedError -> URLError traceback ...]

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:621:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'factforge.net',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... identical ConnectionRefusedError -> URLError traceback ...]

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:702:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'factforge.net',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... identical ConnectionRefusedError -> URLError traceback ...]

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:488:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
[... identical ConnectionRefusedError -> URLError traceback ...]

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:684:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '437',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
[... identical ConnectionRefusedError -> URLError traceback ...]

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:600:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
[... identical ConnectionRefusedError -> URLError traceback ...]

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:642:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
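Since the pbuilder chroot never provides these endpoints, one way a suite like this can avoid hard failures in offline builds is to convert "connection refused" into a skip. A hypothetical helper, not part of the SPARQLWrapper test suite, shown only as a sketch:

    import functools
    import unittest
    import urllib.error

    def skip_if_offline(test):
        # Wraps a test method so that URLError becomes a skip, not an error.
        @functools.wraps(test)
        def wrapper(self, *args, **kwargs):
            try:
                return test(self, *args, **kwargs)
            except urllib.error.URLError as err:
                raise unittest.SkipTest(f"endpoint unreachable: {err.reason}")
        return wrapper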
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testAskByPOSTinN3_Unexpected_Conneg> def testAskByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, POST, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:642: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0149de90>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01940ad0> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow_Conneg>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...}
host = '127.0.0.1:9'

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:721:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames and do_open() listing identical to the first traceback above; elided]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f0d020>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941c50> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testAskByPOSTinXML_Conneg> def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:505: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f0d020>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941c50> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected_Conneg>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
host = '127.0.0.1:9'

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:898:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames and do_open() listing identical to the first traceback above; elided]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f0db20>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941c50> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testConstructByGETinJSONLD_Conneg> def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:864: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f0db20>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941c50> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testConstructByGETinJSON_Unexpected_Conneg>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
host = '127.0.0.1:9'

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:936:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames and do_open() listing identical to the first traceback above; elided]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f0e570>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e270> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testConstructByGETinN3_Conneg> def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:834: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f0e570>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e270> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML_Conneg>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
host = '127.0.0.1:9'

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:774:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames and do_open() listing identical to the first traceback above; elided]
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f0ef10>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7edd0> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE_Conneg> def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:804: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00f0ef10>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7edd0> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_graphdbEnterprise__v8_9_0__rs.py:972: in testConstructByGETinUnknow_Conneg
    result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

host = '127.0.0.1:9'; Host: factforge.net; Accept: application/rdf+xml
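Every failure in this run reduces to the same transport error: the requests name factforge.net in their Host header but are actually directed at 127.0.0.1:9, where nothing is listening. A standalone sketch of that failure mode, assuming nothing is bound to local port 9 (the traditional discard port, closed on most build hosts):

import socket

try:
    # Same call the traceback ends in: a TCP connect to a closed local port.
    socket.create_connection(("127.0.0.1", 9), timeout=1).close()
except ConnectionRefusedError as exc:
    print(exc)  # [Errno 111] Connection refused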
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_graphdbEnterprise__v8_9_0__rs.py:744: in testConstructByGETinXML_Conneg
    result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

host = '127.0.0.1:9'; Host: factforge.net; Accept: application/rdf+xml
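For context, the call chain sparql.query() -> Wrapper._query() -> urlopener(request) seen in these tracebacks corresponds to ordinary SPARQLWrapper usage. A sketch of what each test exercises, with a hypothetical endpoint URL standing in for the suite's rewritten one; URLError is what a caller sees when the endpoint is unreachable:

from urllib.error import URLError
from SPARQLWrapper import SPARQLWrapper, XML

# Hypothetical endpoint URL; the suite points its requests at 127.0.0.1:9.
sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")
sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")
sparql.setReturnFormat(XML)
sparql.setOnlyConneg(True)  # negotiate the result format via Accept only

try:
    result = sparql.query()
except URLError as err:
    print(err.reason)  # ConnectionRefusedError(111, 'Connection refused')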
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_graphdbEnterprise__v8_9_0__rs.py:917: in testConstructByPOSTinCSV_Unexpected_Conneg
    result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

host = '127.0.0.1:9'; Host: factforge.net; Accept: */*;
Content-Type: application/x-www-form-urlencoded; Content-Length: 573
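The only thing that varies between these Conneg failures is the Accept header derived from the requested return format. Collecting the values observed in the requests of this log (a descriptive summary of the log, not SPARQLWrapper's internal table):

# Accept headers observed in this log, keyed by the format each test requests:
ACCEPT_OBSERVED = {
    "TURTLE": "application/turtle,text/turtle",
    "XML / unknown ('foo', 'bar')": "application/rdf+xml",
    "JSONLD": "application/ld+json,application/x-json+ld",
    "N3": "application/turtle,text/turtle,text/rdf+n3,"
          "application/n-triples,application/n3,text/n3",
    "CSV / JSON (unexpected for a CONSTRUCT query)": "*/*",
}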
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_graphdbEnterprise__v8_9_0__rs.py:879: in testConstructByPOSTinJSONLD_Conneg
    result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

host = '127.0.0.1:9'; Host: factforge.net; Accept: application/ld+json,application/x-json+ld;
Content-Type: application/x-www-form-urlencoded; Content-Length: 573
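As the do_open() source quoted in these tracebacks shows, urllib catches the OSError raised by h.request() and re-raises it as URLError, so callers deal with a single exception type for transport failures; the original error stays available as .reason. A small sketch against the same unreachable address:

from urllib.request import urlopen
from urllib.error import URLError

try:
    urlopen("http://127.0.0.1:9/", timeout=1)
except URLError as err:
    # err.reason carries the underlying OSError, here ConnectionRefusedError.
    print(type(err.reason).__name__, err.reason)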
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_graphdbEnterprise__v8_9_0__rs.py:955: in testConstructByPOSTinJSON_Unexpected_Conneg
    result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

host = '127.0.0.1:9'; Host: factforge.net; Accept: */*;
Content-Type: application/x-www-form-urlencoded; Content-Length: 573
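The create_connection() docstring quoted throughout this log mentions the all_errors flag (Python 3.11+): by default only the last per-address error is raised, while all_errors=True raises an ExceptionGroup covering every address getaddrinfo returned. A sketch, again assuming port 9 is closed:

import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
except* ConnectionRefusedError as group:
    # One entry per address create_connection attempted.
    for exc in group.exceptions:
        print(exc)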
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_graphdbEnterprise__v8_9_0__rs.py:849: in testConstructByPOSTinN3_Conneg
    result = self.__generic(constructQuery, N3, POST, onlyConneg=True)
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

host = '127.0.0.1:9'; Host: factforge.net;
Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3;
Content-Type: application/x-www-form-urlencoded; Content-Length: 573
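The tunnelling branch of the quoted do_open() moves Proxy-Authorization out of the request headers because, for a CONNECT tunnel, the credential belongs on the CONNECT request to the proxy and must not reach the origin server. The equivalent direct use of http.client, with a hypothetical proxy and credential:

import http.client

# Hypothetical proxy host/port and Base64 credential.
conn = http.client.HTTPSConnection("proxy.example.org", 3128)
conn.set_tunnel("factforge.net", 443,
                headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
# A subsequent conn.request(...) first issues CONNECT factforge.net:443
# carrying Proxy-Authorization, then tunnels the real request without it.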
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_graphdbEnterprise__v8_9_0__rs.py:789: in testConstructByPOSTinRDFXML_Conneg
    result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

host = '127.0.0.1:9'; Host: factforge.net; Accept: application/rdf+xml;
Content-Type: application/x-www-form-urlencoded; Content-Length: 573
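Two smaller details of the quoted do_open() are worth noting: it forces Connection: close because urllib's addinfourl cannot manage a persistent connection, and it Title-Cases header names for consistent wire formatting. A sketch of that normalization on headers like the ones above:

headers = {"accept": "application/rdf+xml", "host": "factforge.net"}
headers["connection"] = "close"  # force a one-shot, non-persistent connection
headers = {name.title(): val for name, val in headers.items()}
print(headers)
# {'Accept': 'application/rdf+xml', 'Host': 'factforge.net', 'Connection': 'close'}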
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_graphdbEnterprise__v8_9_0__rs.py:819: in testConstructByPOSTinTURTLE_Conneg
    result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

host = '127.0.0.1:9'; Host: factforge.net; Accept: application/turtle,text/turtle;
Content-Type: application/x-www-form-urlencoded; Content-Length: 573
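These are integration tests against a live endpoint, so in an offline environment every one of them fails the same way. One way (an illustration, not what this suite does) to make such tests skip instead of fail when the endpoint is unreachable:

import socket
import pytest

def _endpoint_reachable(host: str = "127.0.0.1", port: int = 9) -> bool:
    """Best-effort TCP probe; False when the endpoint cannot be reached."""
    try:
        socket.create_connection((host, port), timeout=1).close()
        return True
    except OSError:
        return False

# Applied at module level, this would skip the whole file's tests offline.
pytestmark = pytest.mark.skipif(
    not _endpoint_reachable(), reason="SPARQL endpoint not reachable"
)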
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01425a70>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02122890> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
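Every failure in this run is the same condition seen from a different test: pbuilder disables network access during the build, the suite's requests end up directed at 127.0.0.1:9 where nothing is listening, socket.connect() is refused, and urllib re-raises the ConnectionRefusedError wrapped in a URLError. A minimal sketch of this failure mode, runnable outside the build chroot, assuming only that python3-sparqlwrapper (SPARQLWrapper 2.0.0) is installed; the endpoint URL below is illustrative, not one used by the test suite:

    # Sketch: querying an endpoint that nothing listens on produces the
    # same URLError chain that fills this log.
    from urllib.error import URLError
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")  # discard port: no listener
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(JSON)
    try:
        sparql.query()
    except URLError as err:
        # err.reason is the underlying ConnectionRefusedError ([Errno 111])
        print("query failed:", err.reason)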
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testConstructByPOSTinUnknow_Conneg>

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:989:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames identical to the failure above;
 host = '127.0.0.1:9', headers = {'Accept': 'application/rdf+xml',
 'Connection': 'close', 'Content-Length': '573',
 'Content-Type': 'application/x-www-form-urlencoded', ...};
 ConnectionRefusedError: [Errno 111] Connection refused at socket.py:849]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testConstructByPOSTinXML_Conneg>

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:759:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames identical to the first failure above;
 host = '127.0.0.1:9', headers = {'Accept': 'application/rdf+xml',
 'Connection': 'close', 'Content-Length': '573',
 'Content-Type': 'application/x-www-form-urlencoded', ...};
 ConnectionRefusedError: [Errno 111] Connection refused at socket.py:849]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByGETinCSV_Unexpected_Conneg>

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames identical to the first failure above;
 host = '127.0.0.1:9', headers = {'Accept': '*/*', 'Connection': 'close',
 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'};
 ConnectionRefusedError: [Errno 111] Connection refused at socket.py:849]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByGETinJSONLD_Conneg>

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1132:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames identical to the first failure above;
 host = '127.0.0.1:9', headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Host': 'factforge.net',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'};
 ConnectionRefusedError: [Errno 111] Connection refused at socket.py:849]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected_Conneg>

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1204:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames identical to the first failure above;
 host = '127.0.0.1:9', headers = {'Accept': '*/*', 'Connection': 'close',
 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'};
 ConnectionRefusedError: [Errno 111] Connection refused at socket.py:849]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByGETinN3_Conneg>

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1102:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames identical to the first failure above;
 host = '127.0.0.1:9', headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'factforge.net',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'};
 ConnectionRefusedError: [Errno 111] Connection refused at socket.py:849]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML_Conneg>

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1042:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib.request/http.client/socket frames identical to the first failure above;
 host = '127.0.0.1:9', headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'};
 ConnectionRefusedError: [Errno 111] Connection refused at socket.py:849]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByGETinTURTLE_Conneg> def testDescribeByGETinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:1072: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01427b70>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02122190> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01427cd0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02121390> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
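As the do_open() listing shows, urllib catches the OSError from http.client and re-raises it as URLError; callers can recover the original socket error from the .reason attribute. A short sketch against the same dead address (the URL here is illustrative):

    from urllib.error import URLError
    from urllib.request import urlopen

    try:
        urlopen("http://127.0.0.1:9/", timeout=1)
    except URLError as err:
        # .reason carries the underlying ConnectionRefusedError
        print(type(err.reason).__name__, err.reason)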
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow_Conneg> def testDescribeByGETinUnknow_Conneg(self): > result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:1240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01427cd0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02121390> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
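do_open() ultimately drives http.client.HTTPConnection, forcing Connection: close because urllib's addinfourl wrapper cannot manage a persistent connection. The same request made directly, without urllib (host and port copied from the traceback):

    import http.client

    conn = http.client.HTTPConnection("127.0.0.1", 9, timeout=1)
    try:
        conn.request("GET", "/", headers={"Connection": "close"})
    except OSError as err:
        print(err)  # the same [Errno 111] seen in every failure here
    finally:
        conn.close()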
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01427ac0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941470> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByGETinXML_Conneg> def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:1012: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01427ac0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941470> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01424260>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943930> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByPOSTinCSV_Unexpected_Conneg> def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:1185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01424260>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943930> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01250520>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942eb0> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSONLD_Conneg> def testDescribeByPOSTinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:1147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01250520>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942eb0> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01250890>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941e10> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSON_Unexpected_Conneg> def testDescribeByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:1223: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01250890>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941e10> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
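The tunnel branch in do_open() only fires for CONNECT tunnelling (e.g. https through an http proxy); plain http proxying, as in this run, never sets req._tunnel_host. For completeness, a direct http.client equivalent of that branch, with a purely hypothetical proxy address and a credential placeholder:

    import http.client

    conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=1)  # hypothetical proxy
    # Proxy-Authorization travels with the CONNECT, not with the tunnelled request:
    conn.set_tunnel("factforge.net", 443,
                    headers={"Proxy-Authorization": "Basic <credentials>"})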
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f012510d0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942510> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByPOSTinN3_Conneg> def testDescribeByPOSTinN3_Conneg(self): > result = self.__generic(describeQuery, N3, POST, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:1117: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f012510d0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942510> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01251700>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943690> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByPOSTinRDFXML_Conneg> def testDescribeByPOSTinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:1057: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01251700>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943690> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
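create_connection() loops over every address that getaddrinfo() resolves, collecting one exception per failed attempt; with a single loopback address there is exactly one entry, which is what gets re-raised. The resolution step on its own:

    import socket

    for af, socktype, proto, canonname, sa in socket.getaddrinfo(
            "127.0.0.1", 9, 0, socket.SOCK_STREAM):
        print(af, sa)  # AddressFamily.AF_INET ('127.0.0.1', 9)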
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01251b20>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02123e70> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByPOSTinTURTLE_Conneg> def testDescribeByPOSTinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) test/test_graphdbEnterprise__v8_9_0__rs.py:1087: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01251b20>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02123e70> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f01251ff0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942dd0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and socket.create_connection source listings identical to the first
failure in this run; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByPOSTinUnknow_Conneg>

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1257:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[do_open source listing repeats; elided]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f01252410>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021202f0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open and socket.create_connection source listings identical to the first
failure in this run; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testDescribeByPOSTinXML_Conneg>

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1027:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[do_open source listing repeats; elided]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f012515a0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941010>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and socket.create_connection source listings identical to the first
failure in this run; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testKeepAlive>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_graphdbEnterprise__v8_9_0__rs.py:1286:
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[do_open source listing repeats; elided]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
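Note: testKeepAlive is the one failure whose full client-side call sequence
appears in the traceback. The same sequence against a reachable endpoint (the
URL below is a placeholder, not the endpoint from this log) would be expected
to return JSON bindings instead of raising URLError; a sketch:

    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    sparql = SPARQLWrapper("http://example.org/sparql")  # placeholder endpoint
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()  # same call the test makes before query()
    results = sparql.query().convert()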
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

[do_open and socket.create_connection source listings identical to the first
failure in this run; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testQueryBadFormed>

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1268:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[do_open source listing repeats; elided]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
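Note: testQueryBadFormed errors here rather than fails. QueryBadFormed would
only be raised once the endpoint answers the malformed query with an error
status; with no reachable endpoint the request dies earlier in urllib and the
URLError escapes assertRaises. A sketch of guarding such a test (the class
and method names are illustrative, not from the test suite):

    import unittest
    import urllib.error
    import urllib.request

    class NetworkGuardExample(unittest.TestCase):
        def test_bad_query_needs_endpoint(self):
            try:
                # stand-in for the query call; same closed port as in the log
                urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)
            except urllib.error.URLError:
                self.skipTest("SPARQL endpoint unreachable")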
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f01252c50>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02123d90>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and socket.create_connection source listings identical to the first
failure in this run; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testQueryManyPrefixes>

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1271:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[do_open source listing repeats; elided]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f01252a40>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943e70>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and socket.create_connection source listings identical to the first
failure in this run; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testQueryWithComma_1>

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1290:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[do_open source listing repeats; elided]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
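Note: the do_open listing quoted throughout these tracebacks strips
Proxy-Authorization from the origin headers and hands it to set_tunnel, so
the credential is sent only to the proxy during CONNECT, never to the origin
server. The same logic in isolation (proxy and host names are placeholders):

    import http.client

    conn = http.client.HTTPSConnection("proxy.example", 3128)  # placeholder proxy
    conn.set_tunnel(
        "origin.example", 443,
        headers={"Proxy-Authorization": "Basic <credentials>"},  # proxy-only header
    )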
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f01253490>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021235b0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and socket.create_connection source listings identical to the first
failure in this run; elided]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testQueryWithComma_3>

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1298:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open

[do_open source listing repeats; elided]

E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByGETinCSV_Conneg>

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:269:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f01253960>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02121b70>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
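Every failure in this run follows the pattern just shown: the connection target 127.0.0.1:9 refuses the TCP connection, socket.create_connection raises ConnectionRefusedError (errno 111), and urllib's do_open re-wraps it as URLError. A minimal sketch, assuming nothing is listening on local port 9 (the discard port), that reproduces the same exception chain outside the test suite:

    import urllib.error
    import urllib.request

    try:
        # Any host:port with no listener takes the same code path seen above.
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=2)
    except urllib.error.URLError as err:
        # urllib catches the OSError from the socket layer and re-raises it
        # wrapped in URLError, exactly as request.py:1322 does in the log.
        print(type(err.reason).__name__, err.reason)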
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f01251f40>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02122c10>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open / http.client / create_connection source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected_Conneg>

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:407:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib frames and do_open source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
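The create_connection docstring quoted in these tracebacks describes the all_errors switch: with the default all_errors=False the loop keeps only the last per-address error and the "raise exceptions[0]" frame above re-raises it; with all_errors=True every failure is collected into an ExceptionGroup. A sketch of the difference on Python 3.11+, again assuming no listener on 127.0.0.1:9:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2, all_errors=True)
    except* ConnectionRefusedError as group:
        # With all_errors=True each getaddrinfo() result that was tried
        # contributes its own exception to the group.
        for exc in group.exceptions:
            print("refused:", exc)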
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f012538b0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021212b0>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open / http.client / create_connection source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByGETinJSON_Conneg>

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:329:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib frames and do_open source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
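The application side of every traceback is the same three SPARQLWrapper frames: query() wraps _query(), which hands a urllib Request to urlopen. A sketch of the corresponding library usage that drives this path (the endpoint URL here is illustrative, not the suite's configuration):

    import urllib.error
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://example.org/sparql")  # hypothetical endpoint
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setReturnFormat(JSON)
    try:
        # query() -> _query() -> urlopen(), the Wrapper.py frames above.
        result = sparql.query().convert()
    except urllib.error.URLError as err:
        # Non-HTTP transport errors propagate out of query() unchanged,
        # which is why the tests fail with a raw URLError.
        print("endpoint unreachable:", err.reason)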
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f012b0310>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942a50>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open / http.client / create_connection source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByGETinN3_Unexpected_Conneg>

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:365:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib frames and do_open source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
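These GraphDB tests target a live endpoint (factforge.net, per the Host header), which an offline pbuilder chroot can never reach. One conventional guard, shown only as a sketch and not something this suite does; _endpoint_reachable is a hypothetical helper:

    import socket
    import unittest

    def _endpoint_reachable(host, port, timeout=2.0):
        """Hypothetical helper: True if a TCP connection succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(_endpoint_reachable("factforge.net", 80),
                         "SPARQL endpoint unreachable (offline build)")
    class OnlineSPARQLTests(unittest.TestCase):
        ...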
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f012b0cb0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7c4b0>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open / http.client / create_connection source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByGETinTSV_Conneg>

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:299:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib frames and do_open source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
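The only part that varies between these onlyConneg failures is the Accept header, which tracks the requested result format. The mapping, with values copied from the request contexts captured in this log:

    # Values taken from the captured request headers above.
    ACCEPT_BY_FORMAT = {
        "csv":  "text/csv",
        "tsv":  "text/tab-separated-values",
        "xml":  "application/sparql-results+xml",
        "json": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",
    }
    # Formats unexpected for a SELECT (JSON-LD, N3) fall back to 'Accept: */*',
    # and the unknown format "foo" falls back to the XML media type.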
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f012b12e0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021219b0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open / http.client / create_connection source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByGETinUnknow_Conneg>

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:445:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib frames and do_open source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
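The do_open source repeated in these tracebacks also shows why every captured header name is Titlecased and why Connection is always 'close': urllib forces a single-use connection and normalizes names before sending. A sketch of just that normalization step:

    # Input names in arbitrary case, as a caller might set them:
    headers = {"accept": "text/csv", "user-agent": "sparqlwrapper 2.0.0"}
    headers["Connection"] = "close"  # forced: single-use connection
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # -> {'Accept': 'text/csv', 'User-Agent': 'sparqlwrapper 2.0.0',
    #     'Connection': 'close'}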
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f012b1700>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7cbb0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open / http.client / create_connection source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByGETinXML_Conneg>

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:236:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib frames and do_open source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
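From here the failures switch from GET to POST; the captured contexts gain Content-Type: application/x-www-form-urlencoded and a Content-Length, because the query now travels as a form-encoded body. A sketch of building such a request with urllib (endpoint URL illustrative):

    from urllib.parse import urlencode
    from urllib.request import Request

    body = urlencode({"query": "SELECT * WHERE { ?s ?p ?o } LIMIT 5"}).encode()
    req = Request("http://example.org/sparql", data=body,
                  headers={"Accept": "text/csv"})
    print(req.get_method())  # "POST" -- supplying a data body switches the method
    # urllib's HTTP handler fills in Content-Type
    # (application/x-www-form-urlencoded) and Content-Length at send time,
    # matching the headers captured in the POST failures below.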
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f012b1bd0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d630>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... do_open / http.client / create_connection source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByPOSTinCSV_Conneg>

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib frames and do_open source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of
        (host, port) for the socket to bind as a source address before
        making the connection.  A host of '' or port 0 tells the OS to
        use the default.  When a connection cannot be created, raises the
        last error if *all_errors* is False, and an ExceptionGroup of all
        errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected_Conneg>

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:428:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f012b1d30>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7c2f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByPOSTinJSON_Conneg>
req = <urllib.request.Request object at 0x7f9f012b1f40>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d390>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open/create_connection source context identical to the first failure above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:344:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
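Every failure in this run follows the same two-step chain visible above: socket.create_connection raises ConnectionRefusedError against 127.0.0.1:9 (port 9, the discard service, where nothing is expected to listen), and urllib's do_open wraps that into a URLError. A minimal sketch, using only the standard library and assuming nothing listens on port 9, that reproduces the same chain outside the test suite:

    # Minimal reproduction sketch: open a URL on a port with no listener.
    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the underlying ConnectionRefusedError (Errno 111)
        print("refused as expected:", err.reason)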
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByPOSTinN3_Unexpected_Conneg>
req = <urllib.request.Request object at 0x7f9f012b2ba0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01942190>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open/create_connection source context identical to the first failure above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:386:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByPOSTinTSV_Conneg>
req = <urllib.request.Request object at 0x7f9f012b2fc0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5db70>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open/create_connection source context identical to the first failure above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:314:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
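The chained frames above show the path the wrapper takes: query() (SPARQLWrapper/Wrapper.py:960) delegates to _query(), which hands a urllib request to urlopener (Wrapper.py:926). A sketch of the equivalent caller-side code, with a hypothetical endpoint URL, assuming the onlyConneg=True argument these tests pass maps to setOnlyConneg(); it shows where the URLError seen in these failures would surface to a user:

    # Hypothetical endpoint; the query/method/format calls mirror what the
    # *_Conneg POST tests above exercise.
    from urllib.error import URLError

    from SPARQLWrapper import JSON, POST, SPARQLWrapper

    sparql = SPARQLWrapper("http://127.0.0.1:9/repositories/test")
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(POST)
    sparql.setReturnFormat(JSON)
    sparql.setOnlyConneg(True)  # rely on content negotiation only

    try:
        result = sparql.query().convert()
    except URLError as err:
        # exactly the exception logged for each failing test here
        print("endpoint unreachable:", err.reason)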
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByPOSTinUnknow_Conneg>
req = <urllib.request.Request object at 0x7f9f012b35f0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5fcb0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open/create_connection source context identical to the first failure above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:464:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self = <test.test_graphdbEnterprise__v8_9_0__rs.SPARQLWrapperTests testMethod=testSelectByPOSTinXML_Conneg>
req = <urllib.request.Request object at 0x7f9f01251b20>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7f4d0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open/create_connection source context identical to the first failure above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:253:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
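From this point on the failing tests come from test_lov-fuseki_on_hold.py and travel through urllib's https_open rather than http_open: do_open receives HTTPSConnection plus an ssl.SSLContext (http_conn_args = {'context': ...}), and the connection is refused before any TLS handshake can start. A sketch of the same HTTPS path driven directly, with a hypothetical URL; the context argument mirrors what https_open passes down:

    import ssl
    import urllib.error
    import urllib.request

    # Build an opener whose HTTPS handler carries an explicit SSL context,
    # as the HTTPSHandler in the failures below does.
    ctx = ssl.create_default_context()
    opener = urllib.request.build_opener(urllib.request.HTTPSHandler(context=ctx))

    try:
        opener.open("https://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # refused at the TCP layer, before the TLS handshake
        print("refused:", err.reason)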
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinCSV>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f012b2ba0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7def0>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection source context identical to the failures above; the
HTTPS path additionally passes through /usr/lib/python3.13/http/client.py:1472
in connect (super().connect())]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)

test/test_lov-fuseki_on_hold.py:536:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinCSV_Conneg>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f012b3e30>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d390>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection source context identical to the failures above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:543:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
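The Accept headers captured in these blocks track the requested return format: text/csv for the CSV tests, the application/sparql-results+json list for JSON, and */* for combinations the endpoint cannot honour (the *_Unexpected tests). A sketch of how a caller selects the format, with a hypothetical endpoint URL; the header named in the comment is the one visible in the captured headers dicts above:

    from SPARQLWrapper import CSV, SPARQLWrapper

    sparql = SPARQLWrapper("https://127.0.0.1:9/sparql")  # hypothetical
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setReturnFormat(CSV)
    # The request built by query() is the one logged above with
    # headers = {'Accept': 'text/csv', ...}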
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinJSON>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f012b1bd0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f850>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection source context identical to the failures above]

E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:604:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
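These failures are environmental rather than functional: the endpoints are simply unreachable from the build chroot. A hypothetical guard of the kind that would let such suites skip cleanly instead of erroring (not part of the package; sketched with the standard library only):

    import socket
    import unittest

    def endpoint_reachable(host, port, timeout=2):
        # True if a TCP connection to (host, port) succeeds within the timeout.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("127.0.0.1", 9),
                         "SPARQL endpoint offline")
    class EndpointTests(unittest.TestCase):
        def test_smoke(self):
            self.assertTrue(True)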
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.  When a connection cannot be created, raises the last
        error if *all_errors* is False, and an ExceptionGroup of all
        errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinJSON>

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:604:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f012b1bd0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f850>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
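All failures in this block share the same root cause: network access is disabled during the build, and the endpoint host apparently resolves to the unreachable discard port (127.0.0.1:9) inside the isolated chroot, so every request dies in socket.create_connection before any SPARQL traffic is sent. A minimal stdlib-only sketch of the two-layer exception visible in the traceback above; the address ('127.0.0.1', 9) is taken directly from the traceback:

    import socket
    import urllib.error
    import urllib.request

    # Nothing listens on the discard port in the network-isolated
    # chroot, so the TCP handshake is refused immediately.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as exc:
        print(exc)            # [Errno 111] Connection refused

    # urllib wraps the same OSError in URLError; this is the exception
    # that SPARQLWrapper's query() lets propagate to each test.
    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as exc:
        print(exc.reason)     # [Errno 111] Connection refused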
______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected>

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:687:
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected_Conneg>

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:697:
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinJSON_Conneg>

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:611:
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected>

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)

test/test_lov-fuseki_on_hold.py:641:
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected_Conneg>

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:651:
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
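Each of these tests exercises the same client path and differs only in the requested return format, visible as the Accept header in the request dumps (e.g. application/sparql-results+json for JSON, text/tab-separated-values for TSV, */* for formats the ASK form does not support). A sketch of the equivalent direct SPARQLWrapper calls; the endpoint URL is an assumption, since the log records only the Host header lov.linkeddata.es:

    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    # Endpoint URL assumed; the log shows only Host: lov.linkeddata.es.
    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSON)  # selects the Accept header seen above
    result = sparql.query()       # raises urllib.error.URLError offline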
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinTSV> def testAskByGETinTSV(self): > result = self.__generic(askQuery, TSV, GET) test/test_lov-fuseki_on_hold.py:570: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00e98050> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e890> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00e98520> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f021233f0> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinTSV_Conneg> def testAskByGETinTSV_Conneg(self): > result = self.__generic(askQuery, TSV, GET, onlyConneg=True) test/test_lov-fuseki_on_hold.py:577: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00e98520> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f021233f0> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00e989f0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02123e70>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.  When a connection cannot be created, raises the last
        error if *all_errors* is False, and an ExceptionGroup of all
        errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinUnknow>

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_lov-fuseki_on_hold.py:731:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[locals and do_open source listing identical to the frame above, ending:]

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

[do_open/create_connection frames identical to testAskByGETinUnknow above]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinUnknow_Conneg>

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:740:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

[frames identical to testAskByGETinUnknow above; Accept: application/sparql-results+xml]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinXML>

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_lov-fuseki_on_hold.py:498:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
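For readers unfamiliar with the call chain sparql.query() -> QueryResult(self._query()) -> urlopener(request) shown above: the test helper __generic is not reproduced in this log, but the equivalent direct SPARQLWrapper 2.0.0 usage would look roughly like the sketch below (the endpoint URL is taken from the Host header; the query text is an illustrative assumption):

    from SPARQLWrapper import SPARQLWrapper, GET, XML

    # Endpoint assumed from the Host header in the log; inside the build
    # chroot it resolves to an unreachable 127.0.0.1:9.
    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setQuery("ASK { ?s ?p ?o }")  # illustrative ASK query
    sparql.setMethod(GET)
    sparql.setReturnFormat(XML)  # -> Accept: application/sparql-results+xml

    result = sparql.query()  # raises URLError while the endpoint is unreachable
    print(result.response.read())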
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

[frames identical to testAskByGETinUnknow above; Accept: application/sparql-results+xml]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg>

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:506:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

[frames identical to testAskByGETinUnknow above; Accept: */*]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected>

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)

test/test_lov-fuseki_on_hold.py:967:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

[frames identical to testAskByGETinUnknow above; Accept: */*]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected_Conneg>

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:976:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

[frames identical to testAskByGETinUnknow above; Accept: application/ld+json,application/x-json+ld]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinJSONLD>

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:928:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

[frames identical to testAskByGETinUnknow above; Accept: application/ld+json,application/x-json+ld]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinJSONLD_Conneg>

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:936:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

[frames identical to testAskByGETinUnknow above; Accept: */*]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinJSON_Unexpected>

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:1010:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
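Since every failure here reflects the deliberately disabled network rather than the code under test (the module name test_lov-fuseki_on_hold.py suggests these remote-endpoint tests were already parked), one hedged mitigation sketch, with the helper name invented for illustration, that would turn these failures into clean skips:

    import socket
    import pytest

    def _endpoint_reachable(host="lov.linkeddata.es", port=443, timeout=3):
        # Probe once; any OSError (refused, unreachable, DNS failure)
        # means the suite is running without network access, as in this
        # pbuilder environment.
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    # Applied as a class or module decorator, this converts the repeated
    # ConnectionRefusedError failures above into skips.
    requires_network = pytest.mark.skipif(
        not _endpoint_reachable(), reason="SPARQL endpoint not reachable"
    )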
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c78100> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943770> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
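The create_connection docstring quoted above distinguishes two failure modes. A short sketch of the difference, assuming Python 3.11+ (where socket.create_connection gained all_errors and ExceptionGroup became a builtin):

    import socket

    # Default behaviour: a single representative error is re-raised
    # (the tracebacks here show "raise exceptions[0]").
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1)
    except ConnectionRefusedError as err:
        print("single error:", err)

    # all_errors=True: every attempted address reports its failure.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except ExceptionGroup as group:
        print("all errors:", group.exceptions)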
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinJSON_Unexpected_Conneg> def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) test/test_lov-fuseki_on_hold.py:1019: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c78100> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943770> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c78890> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c2f0> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
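The tunnelling branch of do_open shown above moves Proxy-Authorization off the request and onto the CONNECT tunnel, because the credential is meant for the proxy, not the origin server. A sketch of the same http.client API used directly; the proxy address, credential, and request path are placeholders, not values from this build:

    import http.client

    # TCP to the proxy first; set_tunnel() makes the connection open a
    # CONNECT tunnel to the origin, carrying the proxy credential only there.
    conn = http.client.HTTPSConnection("proxy.example.org", 3128, timeout=5)
    conn.set_tunnel(
        "lov.linkeddata.es", 443,
        headers={"Proxy-Authorization": "Basic <placeholder>"},
    )
    conn.request("GET", "/sparql?query=...")  # path illustrative only
    print(conn.getresponse().status)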
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinN3> def testConstructByGETinN3(self): > result = self.__generic(constructQuery, N3, GET) test/test_lov-fuseki_on_hold.py:890: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c78890> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c2f0> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c785d0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01941010> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
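The lower frames of each failure show the path the suite takes: the __generic helper in test/test_lov-fuseki_on_hold.py calls sparql.query(), which hands a urllib request to the opener. A sketch of the same public SPARQLWrapper 2.0.0 API, shaped like testConstructByGETinN3; the endpoint URL and query text are illustrative stand-ins for whatever the test module defines:

    from SPARQLWrapper import GET, N3, SPARQLWrapper

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setMethod(GET)        # query sent as an HTTP GET
    sparql.setReturnFormat(N3)   # yields the N3/Turtle Accept header seen above
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1")
    result = sparql.query()      # raises URLError if the endpoint is unreachable
    print(result.convert())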
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinN3_Conneg> def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) test/test_lov-fuseki_on_hold.py:898: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c785d0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01941010> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c78470> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943930> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML> def testConstructByGETinRDFXML(self): > result = self.__generic(constructQuery, RDFXML, GET) test/test_lov-fuseki_on_hold.py:814: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c78470> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943930> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c78260> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01941550> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
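As the "raise URLError(err)" frames show, urllib wraps the low-level OSError so that callers deal with a single exception type; the original ConnectionRefusedError survives as the reason attribute. A standard-library sketch of handling it at the call site:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)
    except urllib.error.URLError as err:
        # err.reason is the underlying ConnectionRefusedError(111, ...)
        print("request failed:", err.reason)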
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML_Conneg> def testConstructByGETinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) test/test_lov-fuseki_on_hold.py:822: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c78260> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01941550> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c78c00> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e430> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE> def testConstructByGETinTURTLE(self): > result = self.__generic(constructQuery, TURTLE, GET) test/test_lov-fuseki_on_hold.py:852: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00c78c00> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e430> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00e989f0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02121010> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
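Two small details of do_open recur in every listing above: header names are normalized to Title-Case, and Connection: close is forced because urllib's response object cannot manage a persistent connection. A dict-level sketch of those two lines:

    headers = {"accept": "text/n3", "user-agent": "sparqlwrapper 2.0.0"}
    headers["Connection"] = "close"  # one request per connection
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Accept': 'text/n3', 'User-Agent': 'sparqlwrapper 2.0.0', 'Connection': 'close'}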
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE_Conneg> def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) test/test_lov-fuseki_on_hold.py:860: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00e989f0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02121010> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f012b35f0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02120050>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinUnknow>

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_lov-fuseki_on_hold.py:1052:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[pytest re-displays do_open here with the same locals and source as above]

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
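Every failure in this run reduces to the same root cause: the suite aims SPARQLWrapper at host '127.0.0.1:9' (the Host header still names lov.linkeddata.es), and nothing in the isolated build environment accepts TCP connections there, so sock.connect() gets ECONNREFUSED and urllib wraps the OSError in a URLError. A minimal sketch reproducing both layers of the traceback outside the test suite, assuming (as here) that nothing is listening on local port 9:

    import socket
    import urllib.error
    import urllib.request

    # Layer 1: the ConnectionRefusedError raised first in each traceback.
    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2)
    except ConnectionRefusedError as err:
        print("socket layer:", err)        # [Errno 111] Connection refused

    # Layer 2: urllib catches that OSError inside do_open() and re-raises
    # it as the URLError that pytest ultimately reports above.
    try:
        urllib.request.urlopen("https://127.0.0.1:9/", timeout=2)
    except urllib.error.URLError as err:
        print("urllib layer:", err.reason)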
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection traceback identical to testConstructByGETinUnknow above, apart from object addresses]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg>

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1060:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection traceback identical to testConstructByGETinUnknow above]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinXML>

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_lov-fuseki_on_hold.py:779:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection traceback identical to testConstructByGETinUnknow above]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg>

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:786:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection traceback identical to testConstructByGETinUnknow above]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinCSV_Unexpected>

    def testDescribeByGETinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, GET)

test/test_lov-fuseki_on_hold.py:1280:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection traceback identical to testConstructByGETinUnknow above]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinCSV_Unexpected_Conneg>

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1289:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection traceback identical to testConstructByGETinUnknow above]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinJSONLD>

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:1244:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection traceback identical to testConstructByGETinUnknow above]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinJSONLD_Conneg>

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1251:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection traceback identical to testConstructByGETinUnknow above]

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected>

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:1323:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00d14c00>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01941630>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected_Conneg>

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1332: 
[__generic/urllib call chain and repeated do_open frame identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00d14d60>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f3f0>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinN3>

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)

test/test_lov-fuseki_on_hold.py:1207: 
[__generic/urllib call chain and repeated do_open frame identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
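As the do_open listing above shows, urllib merges the request's unredirected headers with the regular ones, forces Connection: close (one request per connection, since addinfourl cannot manage a persistent one), and title-cases the header names before sending. A standalone sketch of that normalization, with hypothetical input values:

    # Hypothetical inputs mirroring the merge that do_open performs.
    unredirected = {"Host": "lov.linkeddata.es"}
    regular = {"accept": "*/*", "user-agent": "sparqlwrapper 2.0.0"}

    headers = dict(unredirected)
    headers.update({k: v for k, v in regular.items() if k not in headers})
    headers["Connection"] = "close"  # close after the single request
    headers = {name.title(): val for name, val in headers.items()}
    print(headers)
    # {'Host': 'lov.linkeddata.es', 'Accept': '*/*',
    #  'User-Agent': 'sparqlwrapper 2.0.0', 'Connection': 'close'}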
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00d14520>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943af0>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinN3_Conneg>

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1215: 
[__generic/urllib call chain and repeated do_open frame identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00d15180>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5fd90>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML>

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_lov-fuseki_on_hold.py:1131: 
[__generic/urllib call chain and repeated do_open frame identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
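The application side of every traceback is the same three frames: sparql.query() -> Wrapper._query() -> urlopener(request). For reference, a sketch of the pattern these tests exercise (the endpoint URL and query text are illustrative assumptions, not taken from the suite):

    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    # Illustrative endpoint and query; the suite targets lov.linkeddata.es
    # but its request is routed through the unreachable proxy address.
    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSON)
    sparql.setQuery("DESCRIBE <http://purl.org/dc/terms/title>")
    result = sparql.query()  # raises URLError when the endpoint is unreachable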
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00d14f70>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7cc90>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML_Conneg>

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1139: 
[__generic/urllib call chain and repeated do_open frame identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00d155a0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5de10>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinTURTLE>

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)

test/test_lov-fuseki_on_hold.py:1169: 
[__generic/urllib call chain and repeated do_open frame identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
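The tunnel branch in do_open only fires for proxied HTTPS requests: Proxy-Authorization is moved onto the CONNECT request so it never reaches the origin server. A hedged sketch of the same idea using http.client directly (the proxy host, port, and credentials are hypothetical):

    import http.client

    # Hypothetical proxy; the CONNECT request carries Proxy-Authorization,
    # while the tunnelled GET below does not, mirroring do_open's handling.
    conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=10)
    conn.set_tunnel("lov.linkeddata.es", 443,
                    headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
    conn.request("GET", "/dataset/lov/sparql")
    response = conn.getresponse()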
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00d15910>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f019403d0>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinTURTLE_Conneg>

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1177: 
[__generic/urllib call chain and repeated do_open frame identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00d15a70>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f4d0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open and create_connection frames identical to the first failure above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow>

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_lov-fuseki_on_hold.py:1365: 
[__generic/urllib call chain and repeated do_open frame identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
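All of the failures above are the same environmental refusal, not regressions in the package itself. For comparison only, a hypothetical guard that a network-dependent suite could use to skip itself cleanly inside a network-less chroot (the names are invented and do not appear in test_lov-fuseki_on_hold.py):

    import socket
    import unittest

    def endpoint_reachable(host="lov.linkeddata.es", port=443, timeout=3):
        # Hypothetical probe; returns False inside the build chroot.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable(), "SPARQL endpoint unreachable")
    class EndpointTests(unittest.TestCase):
        def test_describe_by_get(self):
            ...  # would issue the DESCRIBE queries shown above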
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open / create_connection frames identical to testDescribeByGETinUnknow above ...]

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1373:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open / create_connection frames identical to testDescribeByGETinUnknow above ...]

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_lov-fuseki_on_hold.py:1096:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
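One level up the stack, do_open() wraps the OSError in a URLError, which is the exception every failing test finally reports. A sketch (again, not part of the build) reproducing that wrapping with plain urllib against the same unreachable address:

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the underlying ConnectionRefusedError
        print("request failed:", err.reason)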
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open / create_connection frames identical to testDescribeByGETinUnknow above ...]

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1103:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open / create_connection frames identical to testDescribeByGETinUnknow above ...]

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_lov-fuseki_on_hold.py:1423:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open / create_connection frames identical to testDescribeByGETinUnknow above ...]

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_lov-fuseki_on_hold.py:1414:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
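The testKeepAlive failure above is the only one whose body shows the full client-side setup instead of the shared __generic() helper. A sketch of the same calls with the URLError handled; the endpoint URL is an assumption (the log only shows the Host header lov.linkeddata.es), and setUseKeepAlive() is left commented out because it needs the optional keepalive package:

    from urllib.error import URLError

    from SPARQLWrapper import GET, JSON, SPARQLWrapper

    endpoint = "https://lov.linkeddata.es/dataset/lov/sparql"  # assumed URL

    sparql = SPARQLWrapper(endpoint)
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    # sparql.setUseKeepAlive()  # as in the test; requires the keepalive package

    try:
        results = sparql.query().convert()  # dict of JSON results on success
    except URLError as err:
        print("endpoint unreachable:", err.reason)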
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open / create_connection frames identical to testDescribeByGETinUnknow above ...]

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_lov-fuseki_on_hold.py:1411:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open / create_connection frames identical to testDescribeByGETinUnknow above ...]

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_lov-fuseki_on_hold.py:1443:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open / create_connection frames identical to testDescribeByGETinUnknow above ...]

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_lov-fuseki_on_hold.py:255:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00cd2a40> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5dd30> headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
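Every failure in this section bottoms out in the same OS-level event: a TCP connect to 127.0.0.1 port 9 (the discard port, on which nothing listens inside the build chroot) is rejected with ECONNREFUSED. A minimal sketch, using only the standard library, that reproduces the innermost ConnectionRefusedError seen above:

    import socket

    try:
        # Nothing listens on 127.0.0.1:9 in the build environment, so the
        # TCP handshake is refused immediately by the kernel.
        socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused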
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00cd2a40>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5dd30>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinCSV_Conneg>

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:262:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
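The tracebacks show host = '127.0.0.1:9' while the Host header stays lov.linkeddata.es: the https request is being tunnelled through a proxy, so req.host is the proxy address and req._tunnel_host carries the origin server (the req._tunnel_host branch in do_open above). A hedged sketch of the same setup; the proxy URL is the one visible in the tracebacks and the endpoint path is illustrative, not a documented test fixture:

    import urllib.request, urllib.error

    # Route https traffic through a proxy on the closed local port, as the
    # build environment appears to do once network access is disabled.
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"https": "http://127.0.0.1:9"})
    )
    try:
        opener.open("https://lov.linkeddata.es/dataset/lov/sparql", timeout=5)
    except urllib.error.URLError as err:
        print(err)  # <urlopen error [Errno 111] Connection refused>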
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00d15910>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01940130>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinJSON>

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:323:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
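The frames in SPARQLWrapper/Wrapper.py show what each test does before urllib takes over: build a SPARQLWrapper, set the query, HTTP method and return format, then call query(). A rough equivalent of the JSON/GET case above (the endpoint URL is illustrative; the test's own __generic helper performs the same steps plus assertions):

    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    # For a SELECT query this yields the Accept header seen in the
    # traceback: application/sparql-results+json,application/json,...
    sparql.setReturnFormat(JSON)
    result = sparql.query()  # Wrapper.py:960 -> _query() -> urlopener(request)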
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00d14ec0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01941010>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected>

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:406:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00cd2e60>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01940210>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected_Conneg>

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
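The "During handling of the above exception, another exception occurred:" banner is Python's implicit exception chaining: do_open catches the OSError and raises URLError inside the except block, so the original ConnectionRefusedError is preserved as __context__. In miniature:

    from urllib.error import URLError

    try:
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:  # what do_open does at request.py:1322
            raise URLError(err)
    except URLError as exc:
        print(exc)                     # <urlopen error [Errno 111] Connection refused>
        print(type(exc.__context__))   # <class 'ConnectionRefusedError'>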
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00cd1c80>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01941c50>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinJSON_Conneg>

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:330:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00cd3490>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c590>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinN3_Unexpected>

    def testSelectByGETinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, GET)

test/test_lov-fuseki_on_hold.py:360:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
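do_open normalizes header names with str.title() before sending, which is why the dumps above consistently show 'Connection', 'Host' and 'User-Agent' regardless of how the caller spelled them. The normalization step in isolation:

    headers = {"connection": "close", "user-agent": "sparqlwrapper 2.0.0"}
    print({name.title(): val for name, val in headers.items()})
    # {'Connection': 'close', 'User-Agent': 'sparqlwrapper 2.0.0'}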
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00cd2fc0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01942eb0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinN3_Unexpected_Conneg>

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:370:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinTSV> def testSelectByGETinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET) test/test_lov-fuseki_on_hold.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00cd2360> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01942190> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
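The frames above show the whole client-side path: the test calls sparql.query(), Wrapper.query() wraps _query() in a QueryResult, and _query() hands a urllib request to urlopener(). A sketch of the same path as a user would drive it, assuming the documented SPARQLWrapper 2.0.0 API; the endpoint URL and query are illustrative:

    from SPARQLWrapper import SPARQLWrapper, TSV

    # Illustrative endpoint; the tests target lov.linkeddata.es, which is
    # unreachable in this network-disabled build.
    sparql = SPARQLWrapper("https://lov.linkeddata.es/dataset/lov/sparql")
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setReturnFormat(TSV)      # requests Accept: text/tab-separated-values

    result = sparql.query()          # Wrapper.query() -> QueryResult(self._query())
    print(result.response.read())    # raw HTTP body; .convert() parses known formats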
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

req = <urllib.request.Request object at 0x7f9f00cd3c20>
h = <http.client.HTTPSConnection object at 0x7f9f01941b70>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... stdlib source and http.client frames identical to the first two failures above ...]

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinTSV_Conneg>

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:296:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

req = <urllib.request.Request object at 0x7f9f00cd3d80>
h = <http.client.HTTPSConnection object at 0x7f9f01d7fcb0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... stdlib source and http.client frames identical to the failures above ...]

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinUnknow>

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_lov-fuseki_on_hold.py:450:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
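Note the Accept header in the locals above: the test asked for the unknown format "foo", yet the request went out with Accept: application/sparql-results+xml, i.e. the wrapper degrades to a default result type. A hypothetical sketch of that kind of fallback mapping, matching the observed behaviour but not taken from SPARQLWrapper's code:

    # Illustrative only: unknown formats fall back to a default Accept type,
    # as the 'foo' -> application/sparql-results+xml headers above suggest.
    _ACCEPT_BY_FORMAT = {
        "xml":  "application/sparql-results+xml",
        "json": "application/sparql-results+json",
        "tsv":  "text/tab-separated-values",
    }
    DEFAULT_ACCEPT = "application/sparql-results+xml"

    def accept_header(fmt: str) -> str:
        # Unknown formats (e.g. "foo") degrade to the XML results type.
        return _ACCEPT_BY_FORMAT.get(fmt, DEFAULT_ACCEPT)

    assert accept_header("foo") == "application/sparql-results+xml"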
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

req = <urllib.request.Request object at 0x7f9f00cd3ac0>
h = <http.client.HTTPSConnection object at 0x7f9f01d7e7b0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... stdlib source and http.client frames identical to the failures above ...]

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinUnknow_Conneg>

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:459:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

req = <urllib.request.Request object at 0x7f9f00cd2e60>
h = <http.client.HTTPSConnection object at 0x7f9f019420b0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... stdlib source and http.client frames identical to the failures above ...]

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinXML>

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_lov-fuseki_on_hold.py:217:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

req = <urllib.request.Request object at 0x7f9f01354100>
h = <http.client.HTTPSConnection object at 0x7f9f01d7e0b0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... stdlib source and http.client frames identical to the failures above ...]

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_lov-fuseki_on_hold.SPARQLWrapperTests testMethod=testSelectByGETinXML_Conneg>

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:225:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f013575f0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132fe70>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open and create_connection source identical to the failures above; the
     plain-HTTP chain differs only in dispatching through http_open and in
     http.client.HTTPConnection.connect (client.py:1003) without super().connect() ...]

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected_Conneg>

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:674:
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

/usr/lib/python3.13/urllib/request.py:1322: URLError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
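From here on the failing suite is test_rdf4j__geosciml, which talks to its endpoint over plain HTTP: the handler is HTTPHandler, http_class is http.client.HTTPConnection, and http_conn_args is empty, whereas the HTTPS failures above passed an ssl.SSLContext as the context argument. A sketch of the two connection types as urllib's handlers construct them; the addresses are illustrative:

    import http.client
    import ssl

    # Plain HTTP: no extra constructor arguments (http_conn_args = {}).
    plain = http.client.HTTPConnection("127.0.0.1", 9, timeout=5)

    # HTTPS: the handler supplies an SSL context via 'context'.
    ctx = ssl.create_default_context()
    secure = http.client.HTTPSConnection("127.0.0.1", 9, timeout=5, context=ctx)

    # Either request would raise ConnectionRefusedError here, as in the log:
    # plain.request("GET", "/")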
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01357ee0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132f5b0> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testAskByGETinJSON_Conneg> def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:591: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01357ee0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132f5b0> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

[traceback identical to testAskByGETinJSON_Conneg above; Accept: '*/*'; connection to 127.0.0.1:9 refused]
test/test_rdf4j__geosciml.py:628: in testAskByGETinN3_Unexpected_Conneg
    result = self.__generic(askQuery, N3, GET, onlyConneg=True)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
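Every failing test goes through the same call chain shown above: __generic() builds a SPARQLWrapper instance and calls query(), which hands a urllib request to do_open(). A minimal sketch of that setup, assuming the public SPARQLWrapper 2.0 API; the endpoint URL below is a stand-in, since in this environment the suite's real endpoint resolves to 127.0.0.1:9:

import SPARQLWrapper
from SPARQLWrapper import SPARQLWrapper, JSON, GET

# Stand-in endpoint mirroring the resolved address in this build.
sparql = SPARQLWrapper("http://127.0.0.1:9/repository/sparql")
sparql.setQuery("ASK { ?s ?p ?o }")  # plays the role of askQuery in the tests
sparql.setReturnFormat(JSON)         # drives the Accept header seen in the log
sparql.setMethod(GET)
sparql.setOnlyConneg(True)           # content negotiation only, no extra URL parameters
result = sparql.query()              # raises urllib.error.URLError here: nothing listens on port 9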
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01304940>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132f150> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testAskByGETinUnknow_Conneg> def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) test/test_rdf4j__geosciml.py:716: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01304940>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132f150> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

[traceback identical to testAskByGETinJSON_Conneg above; Accept: 'application/sparql-results+xml'; connection to 127.0.0.1:9 refused]
test/test_rdf4j__geosciml.py:494: in testAskByGETinXML_Conneg
    result = self.__generic(askQuery, XML, GET, onlyConneg=True)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
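do_open(), quoted above, wraps any OSError raised by the HTTP layer in urllib.error.URLError, which is why each test ultimately reports URLError with the refused connection as its reason. A short sketch of the same wrapping seen from the caller's side:

import urllib.error
import urllib.request

# urllib re-raises the low-level ConnectionRefusedError as URLError and keeps
# the original exception in .reason -- the exact chaining shown in these logs.
try:
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
except urllib.error.URLError as err:
    print(type(err.reason).__name__, err.reason)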
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01305b20>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02123930> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected_Conneg> def testAskByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:697: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01305b20>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02123930> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

[traceback identical to testAskByGETinJSON_Conneg above; Accept: 'application/sparql-results+json,application/json,text/javascript,application/javascript'; POST body url-encoded; connection to 127.0.0.1:9 refused]
test/test_rdf4j__geosciml.py:606: in testAskByPOSTinJSON_Conneg
    result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
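The POST variants fail identically; the only transport difference, visible in the headers logged above ('Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '238'), is that the query travels url-encoded in the request body rather than in the query string. A sketch of the same setup switched to POST, under the same stand-in-endpoint assumption as the earlier GET sketch:

from SPARQLWrapper import SPARQLWrapper, JSON, POST

# Same stand-in endpoint; POST sends the query url-encoded in the body, hence
# the Content-Type/Content-Length headers that the GET requests lack.
sparql = SPARQLWrapper("http://127.0.0.1:9/repository/sparql")
sparql.setQuery("ASK { ?s ?p ?o }")
sparql.setReturnFormat(JSON)
sparql.setMethod(POST)
sparql.setOnlyConneg(True)
result = sparql.query()  # urllib.error.URLError: <urlopen error [Errno 111] Connection refused>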
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

[traceback identical to testAskByGETinJSON_Conneg above; Accept: '*/*'; POST body url-encoded; connection to 127.0.0.1:9 refused]
test/test_rdf4j__geosciml.py:651: in testAskByPOSTinN3_Unexpected_Conneg
    result = self.__generic(askQuery, N3, POST, onlyConneg=True)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

[traceback identical to testAskByGETinJSON_Conneg above; Accept: 'application/sparql-results+xml'; POST body url-encoded; connection to 127.0.0.1:9 refused]
test/test_rdf4j__geosciml.py:735: in testAskByPOSTinUnknow_Conneg
    result = self.__generic(askQuery, "bar", POST, onlyConneg=True)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
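These failures are environmental: the build host provides no route to the endpoint, so every request dies before reaching it. A hypothetical guard, not present in the actual suite, that probes the endpoint once and skips the whole class when it is unreachable:

import socket
import unittest

def endpoint_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to (host, port) can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical: test_rdf4j__geosciml.py defines no such guard today.
@unittest.skipUnless(endpoint_reachable("vocabs.ands.org.au", 80),
                     "SPARQL endpoint unreachable; skipping network tests")
class SPARQLWrapperTests(unittest.TestCase):
    ...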
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f013073e0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02123cb0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testAskByPOSTinXML_Conneg> def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:511: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f013073e0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02123cb0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f01307e30>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132e7b0> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:878:  (Accept: application/ld+json,application/x-json+ld)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:950:  (Accept: */*)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:848:  (Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:788:  (Accept: application/rdf+xml)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:818:  (Accept: application/turtle,text/turtle)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:986:  (Accept: application/rdf+xml)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
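A side detail visible in the do_open() listing above: whatever headers the caller sets, urllib forces "Connection: close" and title-cases every header name before sending, which is why the dumped headers dicts all show 'Accept', 'Connection', 'Content-Length' in canonical form. The normalization reduces to the following sketch (header values copied from the dumps above):

# Caller-supplied headers, as SPARQLWrapper sets them on the Request:
headers = {"accept": "application/rdf+xml",
           "user-agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)"}

headers["Connection"] = "close"                                 # forced by do_open()
headers = {name.title(): val for name, val in headers.items()}  # canonical casing

print(headers)
# {'Accept': 'application/rdf+xml',
#  'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)',
#  'Connection': 'close'}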
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg> def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) test/test_rdf4j__geosciml.py:986: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00bdcb50>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132ea50> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00bddd30>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132e6d0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg> def testConstructByGETinXML_Conneg(self): > result = self.__generic(constructQuery, XML, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:758: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00bddd30>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132e6d0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00bddc80>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132d630> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

[do_open/create_connection traceback identical to the first failure above;
 request headers: {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinCSV_Unexpected_Conneg>

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:931:
[frames identical to the first failure above: __generic -> sparql.query() -> urlopener -> do_open]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

[do_open/create_connection traceback identical to the first failure above;
 request headers: {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinJSONLD_Conneg>

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:893:
[frames identical to the first failure above: __generic -> sparql.query() -> urlopener -> do_open]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

[do_open/create_connection traceback identical to the first failure above;
 request headers: {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinJSON_Unexpected_Conneg>

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:969:
[frames identical to the first failure above: __generic -> sparql.query() -> urlopener -> do_open]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

[do_open/create_connection traceback identical to the first failure above;
 request headers: {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinN3_Conneg>

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:863:
[frames identical to the first failure above: __generic -> sparql.query() -> urlopener -> do_open]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

[do_open/create_connection traceback identical to the first failure above;
 request headers: {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinRDFXML_Conneg>

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:803:
[frames identical to the first failure above: __generic -> sparql.query() -> urlopener -> do_open]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

[do_open/create_connection traceback identical to the first failure above;
 request headers: {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinTURTLE_Conneg>

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:833:
[frames identical to the first failure above: __generic -> sparql.query() -> urlopener -> do_open]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

[do_open/create_connection traceback identical to the first failure above;
 request headers: {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}]
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinUnknow_Conneg>

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1003:
[frames identical to the first failure above: __generic -> sparql.query() -> urlopener -> do_open]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
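Failures like these are expected in an offline build. One conventional way for a suite to stay green without network, shown here only as a sketch and not as what test_rdf4j__geosciml.py actually does, is to probe the endpoint once and skip the class when it is unreachable:

    import socket
    import unittest

    def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("127.0.0.1", 9),
                         "SPARQL endpoint unreachable; skipping online tests")
    class OnlineTests(unittest.TestCase):
        def test_query(self):
            ...  # network-dependent assertions go here

Against this build's dead address the probe fails fast and the tests are reported as skipped rather than errored.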
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00be83c0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021210f0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testConstructByPOSTinXML_Conneg> def testConstructByPOSTinXML_Conneg(self): > result = self.__generic(constructQuery, XML, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:773: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00be83c0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021210f0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00be8730>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02121630> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByGETinJSONLD_Conneg>

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1146:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected_Conneg>

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1218:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00be95a0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132d470> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByGETinN3_Conneg> def testDescribeByGETinN3_Conneg(self): > result = self.__generic(describeQuery, N3, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:1116: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00be95a0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132d470> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML_Conneg>

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1056:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00be8680>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132d1d0> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByGETinTURTLE_Conneg> def testDescribeByGETinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:1086: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00be8680>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132d1d0> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow_Conneg>

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1254:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00be9180>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5f230> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByGETinXML_Conneg> def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:1025: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00be9180>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5f230> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00be9e90>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5d630> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00be9e90>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5d630>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open listing identical to the first failure above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple
        ``(host, port)``) and return the socket object.  Passing the
        optional *timeout* parameter will set the timeout on the socket
        instance before attempting to connect.  If no *timeout* is
        supplied, the global default timeout setting returned by
        :func:`getdefaulttimeout` is used.  If *source_address* is set
        it must be a tuple of (host, port) for the socket to bind as a
        source address before making the connection.  A host of '' or
        port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByPOSTinCSV_Unexpected_Conneg>

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1199:
    [call chain and do_open listing identical to the first failure above]
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1161:
    [call chain, do_open and create_connection listings identical to the
    failures above]
E           ConnectionRefusedError: [Errno 111] Connection refused
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1237:
    [call chain, do_open and create_connection listings identical to the
    failures above]
E           ConnectionRefusedError: [Errno 111] Connection refused
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
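As the create_connection docstring quoted above notes, passing all_errors=True collects every per-address connect failure into an ExceptionGroup instead of re-raising only the last one. A short sketch of that behaviour (Python 3.11+, same assumption that local port 9 refuses connections):

# Observe the all_errors=True path of socket.create_connection.
import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
except ExceptionGroup as group:
    # One entry per address returned by getaddrinfo that failed to connect.
    for exc in group.exceptions:
        print(type(exc).__name__, exc)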
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d88680>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7ef90> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByPOSTinN3_Conneg> def testDescribeByPOSTinN3_Conneg(self): > result = self.__generic(describeQuery, N3, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:1131: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d88680>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7ef90> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d88730>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7f5b0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testDescribeByPOSTinRDFXML_Conneg> def testDescribeByPOSTinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:1071: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d88730>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7f5b0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d89390>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132faf0> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1101:
    [call chain, do_open and create_connection listings identical to the
    failures above]
E           ConnectionRefusedError: [Errno 111] Connection refused
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1271:
    [call chain, do_open and create_connection listings identical to the
    failures above]
E           ConnectionRefusedError: [Errno 111] Connection refused
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1041:
    [call chain, do_open and create_connection listings identical to the
    failures above]
E           ConnectionRefusedError: [Errno 111] Connection refused
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________________ SPARQLWrapperTests.testKeepAlive _______________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d89a70>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02123150> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00d89a70>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02123150>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testKeepAlive>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_rdf4j__geosciml.py:1305:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00d89a70>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02123150>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
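The testKeepAlive source shown above is a complete picture of the SPARQLWrapper call sequence the suite exercises. A sketch of the same sequence against a hypothetical endpoint URL (in this build the tests' `endpoint` resolves to 127.0.0.1:9, so query() can never connect):

    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    sparql = SPARQLWrapper("http://example.org/sparql")  # hypothetical endpoint
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()  # opt in to persistent connections where available
    results = sparql.query().convert()  # raises URLError if the endpoint is unreachable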
___________________ SPARQLWrapperTests.testQueryBadFormed_1 ____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testQueryBadFormed_1>

    def testQueryBadFormed_1(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed_1, XML, GET)

test/test_rdf4j__geosciml.py:1282:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
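As the create_connection() listing shows, the stdlib tries every address returned by getaddrinfo(), collects the per-address exceptions, and re-raises the first one once the list is exhausted (the `raise exceptions[0]` frame at socket.py:864). The underlying failure can be reproduced without SPARQLWrapper at all; a minimal sketch:

    import socket

    try:
        sock = socket.create_connection(("127.0.0.1", 9), timeout=1.0)
    except ConnectionRefusedError as err:
        print(err)  # [Errno 111] Connection refused
    else:
        sock.close()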
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00d89650>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021225f0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testQueryManyPrefixes>

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_rdf4j__geosciml.py:1289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00d89650>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021225f0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00d89390>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02121c50>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testQueryWithComma_1>

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_rdf4j__geosciml.py:1309:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00d89390>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02121c50>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00d8a360>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021202f0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testQueryWithComma_3>

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_rdf4j__geosciml.py:1317:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x7f9f00d8a360>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021202f0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testSelectByGETinCSV_Conneg> def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:266: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d8a6d0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02120210> headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d8b3e0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132e5f0> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected_Conneg> def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:409: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d8b3e0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132e5f0> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d8ae60>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02121e10> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testSelectByGETinJSON_Conneg> def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:326: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d8ae60>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02121e10> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d89ff0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132c750> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

req = <urllib.request.Request object at 0x7f9f00d89ff0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132c750>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:363:
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

req = <urllib.request.Request object at 0x7f9f00d8bac0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02122350>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:296:
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

req = <urllib.request.Request object at 0x7f9f00d683c0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132e270>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:451:
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d68ec0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021234d0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testSelectByGETinXML_Conneg> def testSelectByGETinXML_Conneg(self): > result = self.__generic(selectQuery, XML, GET, onlyConneg=True) test/test_rdf4j__geosciml.py:233: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d68ec0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021234d0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

req = <urllib.request.Request object at 0x7f9f00d68e10>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132e430>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:281:
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
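create_connection, quoted in the first failure, first resolves the target with getaddrinfo and then tries each returned address in turn. The resolution step in isolation:

    import socket

    # For a literal IPv4 address this yields a single AF_INET entry.
    for af, socktype, proto, canonname, sa in socket.getaddrinfo(
            "127.0.0.1", 9, 0, socket.SOCK_STREAM):
        print(af, socktype, sa)  # e.g. AddressFamily.AF_INET ... ('127.0.0.1', 9)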
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d69860>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132fd90> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected_Conneg> def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:432: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d69860>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132fd90> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

req = <urllib.request.Request object at 0x7f9f00d699c0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132db70>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:341:
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
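Before sending, do_open (quoted in full in the first failure) forces Connection: close and title-cases every header name; the per-test Accept values above pass through this normalization unchanged. The transformation in isolation:

    # The dict comprehension do_open applies to the merged header set:
    raw = {"accept": "text/csv", "connection": "close",
           "content-type": "application/x-www-form-urlencoded"}
    normalized = {name.title(): val for name, val in raw.items()}
    print(normalized)
    # {'Accept': 'text/csv', 'Connection': 'close',
    #  'Content-Type': 'application/x-www-form-urlencoded'}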
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

req = <urllib.request.Request object at 0x7f9f00d69d30>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021210f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:386:
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

req = <urllib.request.Request object at 0x7f9f00d6a0a0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132edd0>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:311:
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d6a4c0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021234d0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
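The docstring quoted above is worth noting: create_connection() iterates over every getaddrinfo() result and, by default, re-raises only the last failure. Since Python 3.11 the all_errors flag instead raises an ExceptionGroup covering every address that was tried, which helps when a host resolves to both IPv4 and IPv6. A sketch (requires Python 3.11 or newer):

    import socket

    # Collect one error per attempted address instead of only the last.
    try:
        socket.create_connection(("localhost", 9), timeout=1,
                                 all_errors=True)
    except ExceptionGroup as eg:
        for exc in eg.exceptions:
            print(type(exc).__name__, exc)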
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testSelectByPOSTinUnknow_Conneg> def testSelectByPOSTinUnknow_Conneg(self): > result = self.__generic(selectQuery, "bar", POST, onlyConneg=True) test/test_rdf4j__geosciml.py:470: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d6a4c0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f021234d0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d6a200>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132edd0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_rdf4j__geosciml.SPARQLWrapperTests testMethod=testSelectByPOSTinXML_Conneg> def testSelectByPOSTinXML_Conneg(self): > result = self.__generic(selectQuery, XML, POST, onlyConneg=True) test/test_rdf4j__geosciml.py:250: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00d6a200>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132edd0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
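Each failure walks the same application frames before reaching urllib: the test helper __generic() builds a SPARQLWrapper instance, and Wrapper.py line 960 (query()) wraps the _query() call at line 926 that invokes the opener. A sketch of that call path as a user of the library would write it (the endpoint URL is a placeholder standing in for the redirected test endpoint):

    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")  # placeholder
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(JSON)
    result = sparql.query()   # Wrapper._query() calls urlopen() here
    print(result.convert())   # never reached against this endpoint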
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094c050> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ec10> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected_Conneg> def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) test/test_stardog__lindas.py:678: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094c050> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ec10> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094c3c0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5ef90> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
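The _Conneg test variants pass onlyConneg=True, so the requested result format travels only in the Accept header, visible in the dump above as the JSON media-type list, rather than also as a format parameter in the query string. Assuming SPARQLWrapper 2.0.0's setOnlyConneg() setter, the equivalent client code is:

    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    sparql = SPARQLWrapper("https://lindas.admin.ch/query")  # host from
                                                             # the log,
                                                             # path assumed
    sparql.setQuery("ASK { ?s ?p ?o }")
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSON)
    sparql.setOnlyConneg(True)  # negotiate via Accept only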
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByGETinJSON_Conneg> def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) test/test_stardog__lindas.py:595: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094c3c0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5ef90> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094c7e0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132def0> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
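The tunnel branch quoted repeatedly above handles CONNECT proxies: do_open moves Proxy-Authorization out of the origin headers into the tunnel headers so the credential is sent only to the proxy, never to the target server. The same behaviour written directly against http.client (proxy host, port, and credential are placeholders):

    import http.client

    conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=5)
    # Credentials ride on the CONNECT request only, not the origin request.
    conn.set_tunnel("lindas.admin.ch", 443,
                    headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
    conn.request("GET", "/query")
    print(conn.getresponse().status)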
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected_Conneg> def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) test/test_stardog__lindas.py:632: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094c7e0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132def0> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094c9f0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f310> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
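The comment block above explains why urllib always forces Connection: close: the addinfourl response object cannot manage a persistent connection and would block trying to drain the socket. Driving http.client by hand shows the same one-shot pattern (example.org is a placeholder; this needs a reachable host):

    import http.client

    conn = http.client.HTTPConnection("example.org", 80, timeout=5)
    conn.request("GET", "/", headers={"Connection": "close"})
    resp = conn.getresponse()
    print(resp.status)
    resp.read()    # safe to drain: the server closes after this response
    conn.close()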
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByGETinUnknow_Conneg> def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) test/test_stardog__lindas.py:720: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094c9f0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f310> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094db20> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132c3d0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
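All of these failures are the expected outcome of running endpoint tests in a build environment with no usable network. A common guard for such suites, shown here only as a generic sketch and not as what this package's tests actually do, probes the endpoint once and skips when it is unreachable:

    import socket
    import pytest

    def _reachable(host, port, timeout=1.0):
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    # Hypothetical marker for tests that need a live SPARQL endpoint.
    needs_endpoint = pytest.mark.skipif(
        not _reachable("127.0.0.1", 9),
        reason="SPARQL endpoint unreachable (offline build)")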
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg> def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) test/test_stardog__lindas.py:498: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094db20> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132c3d0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0094e780> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943e70> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

test/test_stardog__lindas.py:701: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

test/test_stardog__lindas.py:610: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

test/test_stardog__lindas.py:655: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

test/test_stardog__lindas.py:739: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

test/test_stardog__lindas.py:515: URLError
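The create_connection docstring quoted in the first traceback describes an all_errors switch: with all_errors=False only the last per-address error is raised, while all_errors=True raises an ExceptionGroup collecting every per-address failure. A short sketch of that behaviour, reusing the unreachable address from this log (requires Python 3.11 or later):

    # Sketch of the all_errors behaviour described in the docstring above.
    # Requires Python >= 3.11 for the all_errors parameter, ExceptionGroup
    # and the except* syntax.
    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=5, all_errors=True)
    except* ConnectionRefusedError as group:
        # One ConnectionRefusedError per address that getaddrinfo() returned.
        print(len(group.exceptions), "connection attempt(s) refused")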
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

test/test_stardog__lindas.py:916: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

test/test_stardog__lindas.py:882: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
E       urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

test/test_stardog__lindas.py:954: URLError
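None of these failures point at a defect in the package itself: every test in test_stardog__lindas.py requires a live SPARQL endpoint, which this offline environment cannot reach. A hypothetical module-level guard of the following shape would let such tests skip cleanly instead of erroring; the endpoint_reachable helper is illustrative and not part of the actual test suite:

    # Hypothetical guard for network-dependent tests. endpoint_reachable()
    # is an illustrative helper, not something present in the test suite.
    import socket

    import pytest

    def endpoint_reachable(host="lindas.admin.ch", port=443, timeout=3):
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    pytestmark = pytest.mark.skipif(
        not endpoint_reachable(),
        reason="SPARQL endpoint unreachable (e.g. offline build environment)",
    )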
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f009c4730> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02123850> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByGETinN3_Conneg> def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) test/test_stardog__lindas.py:852: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f009c4730> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02123850> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

req = <urllib.request.Request object at 0x7f9f009c4310>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02120590>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML_Conneg>

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_stardog__lindas.py:792:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

req = <urllib.request.Request object at 0x7f9f009c4cb0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ef90>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE_Conneg>

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_stardog__lindas.py:822:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

req = <urllib.request.Request object at 0x7f9f009c47e0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02120590>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg>

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_stardog__lindas.py:990:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

req = <urllib.request.Request object at 0x7f9f009c4940>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7ec10>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg>

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_stardog__lindas.py:762:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

req = <urllib.request.Request object at 0x7f9f009c5020>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5d7f0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByPOSTinCSV_Unexpected_Conneg>

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_stardog__lindas.py:935:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

req = <urllib.request.Request object at 0x7f9f009c55a0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f021203d0>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByPOSTinJSONLD_Conneg>

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_stardog__lindas.py:897:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f009c5a70> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02120bb0> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByPOSTinJSON_Unexpected_Conneg> def testConstructByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, POST, onlyConneg=True) test/test_stardog__lindas.py:973: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f009c5a70> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02120bb0> headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByPOSTinN3_Conneg>

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:867: 
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
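All of these failures share one root cause: the suite's endpoint resolves to 127.0.0.1:9, where nothing is listening, so every socket connect() is refused and urllib wraps the resulting OSError in a URLError. A minimal sketch of that same two-layer error path, assuming only that port 9 on localhost stays closed (the "/query" path is an illustrative placeholder):

import urllib.error
import urllib.request

try:
    # Same chain as SPARQLWrapper/Wrapper.py:926 above:
    # urlopen() -> do_open() -> create_connection() -> sock.connect() refused
    urllib.request.urlopen("https://127.0.0.1:9/query", timeout=5)
except urllib.error.URLError as err:
    # do_open() catches the OSError and re-raises it as URLError,
    # which is the /usr/lib/python3.13/urllib/request.py:1322 frame above.
    print(err.reason)  # [Errno 111] Connection refused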
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByPOSTinRDFXML_Conneg>

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_stardog__lindas.py:807: 
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByPOSTinTURTLE_Conneg>

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_stardog__lindas.py:837: 
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
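The only thing that varies between these failures is the recorded Accept header: with onlyConneg=True, SPARQLWrapper selects the result serialization purely through content negotiation, so the N3 test asks for text/rdf+n3 and friends while the TURTLE test asks for application/turtle,text/turtle. A sketch of the kind of call the TURTLE test wraps; the endpoint URL and query string are illustrative placeholders, not values taken from the suite:

from SPARQLWrapper import SPARQLWrapper, TURTLE, POST

sparql = SPARQLWrapper("https://lindas.admin.ch/query")  # placeholder URL
sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")  # placeholder query
sparql.setMethod(POST)            # send the form-encoded query in a POST body
sparql.setReturnFormat(TURTLE)    # -> Accept: application/turtle,text/turtle
sparql.setOnlyConneg(True)        # negotiate via the Accept header only
result = sparql.query()           # raises URLError while the endpoint is down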
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByPOSTinUnknow_Conneg>

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_stardog__lindas.py:1007: 
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testConstructByPOSTinXML_Conneg>

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_stardog__lindas.py:777: 
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
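Note that XML, RDFXML and the unknown format "bar" all negotiate the same Accept: application/rdf+xml for a CONSTRUCT query; the unrecognized value falls back to the default RDF serialization. At the http.client level, the request that fails in every one of these tracebacks reduces to roughly the following, assuming the same unreachable host and a placeholder request body:

import http.client

conn = http.client.HTTPSConnection("127.0.0.1", 9, timeout=5)
try:
    conn.request("POST", "/query", body=b"query=...",  # placeholder body
                 headers={"Accept": "application/rdf+xml",
                          "Content-Type": "application/x-www-form-urlencoded",
                          "Connection": "close"})
except OSError as err:
    # ConnectionRefusedError is an OSError subclass, which is why the
    # `except OSError` clause in do_open() above catches it.
    print(err)  # [Errno 111] Connection refused
finally:
    conn.close()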
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByGETinCSV_Unexpected_Conneg>

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_stardog__lindas.py:1183: 
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByGETinJSONLD_Conneg>

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_stardog__lindas.py:1149: 
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
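The DESCRIBE tests differ from the CONSTRUCT ones only in using GET, so the query travels in the URL and the requests carry no Content-Length; the recorded Host: lindas.admin.ch and the sparqlwrapper 2.0.0 User-Agent show the request was built for the real endpoint even though the socket target is 127.0.0.1:9. A GET-variant sketch mirroring the JSONLD test, again with a placeholder endpoint and query:

from SPARQLWrapper import SPARQLWrapper, JSONLD, GET

sparql = SPARQLWrapper("https://lindas.admin.ch/query")  # placeholder URL
sparql.setQuery("DESCRIBE <http://example.org/thing>")   # placeholder query
sparql.setMethod(GET)            # query is URL-encoded into the request URL
sparql.setReturnFormat(JSONLD)   # -> Accept: application/ld+json,application/x-json+ld
sparql.setOnlyConneg(True)
result = sparql.query()          # URLError([Errno 111]) while the port is closed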
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected_Conneg>

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_stardog__lindas.py:1221: 
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByGETinN3_Conneg>
req = <urllib.request.Request object at 0x7f9f00b1cd60>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132e510>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_stardog__lindas.py:1119:
[... identical do_open/create_connection traceback as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML_Conneg>
req = <urllib.request.Request object at 0x7f9f00b1dc80>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132fa10>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_stardog__lindas.py:1059:
[... identical do_open/create_connection traceback as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByGETinTURTLE_Conneg>
req = <urllib.request.Request object at 0x7f9f00b1d860>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132cbb0>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_stardog__lindas.py:1089:
[... identical do_open/create_connection traceback as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow_Conneg>
req = <urllib.request.Request object at 0x7f9f00b1d4f0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5eeb0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_stardog__lindas.py:1257:
[... identical do_open/create_connection traceback as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByGETinXML_Conneg>
req = <urllib.request.Request object at 0x7f9f0094ea40>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5c9f0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_stardog__lindas.py:1029:
[... identical do_open/create_connection traceback as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByPOSTinCSV_Unexpected_Conneg>
req = <urllib.request.Request object at 0x7f9f00b1ea40>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c590>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_stardog__lindas.py:1202:
[... identical do_open/create_connection traceback as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
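Note: the POST variants carry the query as an application/x-www-form-urlencoded body, which is where the Content-Length: 149 in the locals comes from. A sketch of how such a body is built (the query text is a placeholder, so the length printed here will differ):

from urllib.parse import urlencode

body = urlencode({"query": "DESCRIBE <http://example.org/resource>"}).encode("ascii")
print(len(body))  # sent as the Content-Length header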
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSONLD_Conneg>
req = <urllib.request.Request object at 0x7f9f00b1c470>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d2b0>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_stardog__lindas.py:1164:
[... identical do_open/create_connection traceback as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSON_Unexpected_Conneg>
req = <urllib.request.Request object at 0x7f9f00b1e8e0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5f930>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_stardog__lindas.py:1240:
[... identical do_open/create_connection traceback as above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
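Note: collecting the Accept headers observed across this run gives the effective format-to-media-type mapping; the *_Unexpected_* tests request a format that makes no sense for a DESCRIBE query and fall back to */*. Transcribed from the failures above, not from the library source:

# Accept headers as observed in this log, keyed by the requested return format.
ACCEPT_OBSERVED = {
    "XML/RDFXML": "application/rdf+xml",
    "TURTLE":     "application/turtle,text/turtle",
    "N3":         "application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3",
    "JSONLD":     "application/ld+json,application/x-json+ld",
    "JSON (unexpected for DESCRIBE)": "*/*",
    "CSV (unexpected for DESCRIBE)":  "*/*",
}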
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00724730> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e5f0> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
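All of the failures in this section share one root cause: in this build environment the suite's endpoint (Host: lindas.admin.ch) resolves to 127.0.0.1:9, where nothing is listening, so every connect() is refused at the socket layer and urllib's do_open() wraps the resulting OSError in a URLError. A minimal sketch of the two layers of the error, assuming, as in this chroot, that no listener is bound to 127.0.0.1 port 9:

# Minimal sketch of the failure mode above; assumes nothing is
# listening on 127.0.0.1:9, as in this network-less build environment.
import socket
import urllib.error
import urllib.request

# Socket layer: create_connection() fails with ConnectionRefusedError.
try:
    socket.create_connection(("127.0.0.1", 9), timeout=1)
except ConnectionRefusedError as err:
    print("socket:", err)   # [Errno 111] Connection refused

# urllib layer: do_open() catches the OSError and re-raises it as URLError.
try:
    urllib.request.urlopen("https://127.0.0.1:9/", timeout=1)
except urllib.error.URLError as err:
    print("urllib:", err)   # <urlopen error [Errno 111] Connection refused>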
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

req = <urllib.request.Request object at 0x7f9f00724730>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e5f0>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... urllib/http.client/socket frames identical to the first traceback above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:1134:
    [... chained frames identical to the first traceback above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

req = <urllib.request.Request object at 0x7f9f00725020>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5dc50>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... urllib/http.client/socket frames identical to the first traceback above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_stardog__lindas.py:1074:
    [... chained frames identical to the first traceback above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

req = <urllib.request.Request object at 0x7f9f00724310>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c830>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... urllib/http.client/socket frames identical to the first traceback above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_stardog__lindas.py:1104:
    [... chained frames identical to the first traceback above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

req = <urllib.request.Request object at 0x7f9f007243c0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132cad0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... urllib/http.client/socket frames identical to the first traceback above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_stardog__lindas.py:1274:
    [... chained frames identical to the first traceback above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

req = <urllib.request.Request object at 0x7f9f00725860>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e6d0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... urllib/http.client/socket frames identical to the first traceback above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_stardog__lindas.py:1044:
    [... chained frames identical to the first traceback above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

req = <urllib.request.Request object at 0x7f9f00725bd0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f070>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... urllib/http.client/socket frames identical to the first traceback above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_stardog__lindas.py:1307:
    [... chained frames identical to the first traceback above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

req = <urllib.request.Request object at 0x7f9f00725230>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5db70>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... urllib/http.client/socket frames identical to the first traceback above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_stardog__lindas.py:1298:
    [... chained frames identical to the first traceback above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

req = <urllib.request.Request object at 0x7f9f00726f10>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5e5f0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... urllib/http.client/socket frames identical to the first traceback above ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_stardog__lindas.py:1293:
    [... chained frames identical to the first traceback above ...]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00726db0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5db70> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
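Every traceback in this run bottoms out the same way: the TCP connect to 127.0.0.1:9 is refused while the Host header still names lindas.admin.ch, which suggests the suite's endpoints resolve to an unreachable local port (9 is the discard port) in this offline build. A minimal sketch of just that socket-level failure, assuming nothing listens on local port 9:

import socket

try:
    # the same call http.client makes above; port 9 (discard) has no listener
    socket.create_connection(("127.0.0.1", 9), timeout=1)
except ConnectionRefusedError as err:
    print(err)  # [Errno 111] Connection refused

The refusal happens before any HTTP or TLS traffic is exchanged, which is why every failing test reports the identical error regardless of query or return format.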
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testQueryWithComma_3>
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_stardog__lindas.py:1319:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByGETinCSV_Conneg>
host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_stardog__lindas.py:270:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected_Conneg>
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_stardog__lindas.py:413:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByGETinJSON_Conneg>
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_stardog__lindas.py:330:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByGETinN3_Unexpected_Conneg>
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_stardog__lindas.py:367:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
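These tests plainly need a live SPARQL endpoint. One way to keep such a run network-independent, offered only as a sketch and not what this package's test suite actually does, is to stub the opener that the 'response = urlopener(request)' frame above calls; the patch target and canned response below are assumptions:

import io
from unittest import mock
from SPARQLWrapper import SPARQLWrapper, XML

canned = io.BytesIO(
    b'<?xml version="1.0"?>'
    b'<sparql xmlns="http://www.w3.org/2005/sparql-results#"><head/><results/></sparql>'
)

# assumption: Wrapper.py resolves 'urlopener' as a module-level name,
# as the traceback frame in _query suggests
with mock.patch("SPARQLWrapper.Wrapper.urlopener", return_value=canned):
    sparql = SPARQLWrapper("https://127.0.0.1:9/query")  # never contacted
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(XML)
    result = sparql.query()  # succeeds against the canned response

With the opener patched, no socket is ever opened, so the ConnectionRefusedError path above is never reached.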
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0079c520> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02123690> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByGETinTSV_Conneg> def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) test/test_stardog__lindas.py:300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0079c520> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02123690> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0079c5d0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02122430> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

req = <urllib.request.Request object at 0x7f9f0079c5d0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_stardog__lindas.py:455:
[... identical urllib/http.client/socket traceback as in testSelectByGETinTSV_Conneg above ...]
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
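For context, the client side of these tracebacks is ordinary SPARQLWrapper usage: query() builds a urllib request (Wrapper.py:960 and :926 above), and socket-level failures surface as urllib.error.URLError. A rough sketch of what the suite's __generic helper drives; the endpoint URL and query text are illustrative guesses, not the suite's actual values:

    from urllib.error import URLError
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://lindas.admin.ch/query")  # illustrative endpoint
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")   # illustrative query
    sparql.setReturnFormat(JSON)  # determines the Accept header sent

    try:
        print(sparql.query().convert())
    except URLError as err:
        print("endpoint unreachable:", err.reason)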
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

req = <urllib.request.Request object at 0x7f9f0079cf70>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_stardog__lindas.py:237:
[... same traceback as above ...]
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

req = <urllib.request.Request object at 0x7f9f0079d0d0>
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_stardog__lindas.py:285:
[... same traceback as above ...]
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
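The Accept header changes from test to test because these are the _Conneg variants: the requested return format is mapped to an HTTP content-negotiation header. In the failures here, CSV and TSV map to text/csv and text/tab-separated-values, XML to application/sparql-results+xml, graph formats that are unexpected for a SELECT (JSONLD, N3) send */*, and unknown strings like "foo" and "bar" fall back to the XML default. An illustrative mapping of the observed values (not SPARQLWrapper's actual internal table):

    # Requested format -> Accept header, as observed in the failures above.
    ACCEPT_BY_FORMAT = {
        "xml": "application/sparql-results+xml",
        "json": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",
        "csv": "text/csv",
        "tsv": "text/tab-separated-values",
    }

    def accept_header(fmt: str) -> str:
        # Only models the */* case for unexpected graph formats; unknown
        # strings actually fell back to the XML default in this log.
        return ACCEPT_BY_FORMAT.get(fmt, "*/*")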
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

req = <urllib.request.Request object at 0x7f9f0079dff0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_stardog__lindas.py:436:
[... same traceback as above ...]
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

req = <urllib.request.Request object at 0x7f9f0079cd60>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)

test/test_stardog__lindas.py:345:
[... same traceback as above ...]
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
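Each failure prints in two parts ('During handling of the above exception, another exception occurred:') because urllib catches the socket-level OSError and re-raises it as URLError inside the except block, implicitly chaining the original ConnectionRefusedError. A minimal sketch of that chaining pattern:

    from urllib.error import URLError

    def fetch():
        # Stand-in for h.request(...): the socket layer fails first.
        raise ConnectionRefusedError(111, "Connection refused")

    try:
        try:
            fetch()
        except OSError as err:      # ConnectionRefusedError is an OSError
            raise URLError(err)     # chains the original via __context__
    except URLError as err:
        print(err.reason)           # the wrapped ConnectionRefusedError
        print(err.__context__)      # what pytest shows as the first part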
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

req = <urllib.request.Request object at 0x7f9f0079e6d0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:390:
[... same traceback as above ...]
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
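The POST variants fail exactly like the GET ones; the only visible difference is transport: the query is form-urlencoded into the request body, hence the additional Content-Type: application/x-www-form-urlencoded and Content-Length: 386 headers. A hypothetical sketch of building such a request with urllib (URL and query are placeholders):

    from urllib.parse import urlencode
    from urllib.request import Request

    body = urlencode({"query": "SELECT * WHERE { ?s ?p ?o }"}).encode("ascii")
    req = Request(
        "https://lindas.admin.ch/query",  # placeholder endpoint
        data=body,  # supplying data makes urllib issue a POST
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    print(req.get_method(), len(body))  # "POST" plus the body length sent as Content-Length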
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

req = <urllib.request.Request object at 0x7f9f0079e830>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_stardog__lindas.py:315:
[... same traceback as above ...]
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

req = <urllib.request.Request object at 0x7f9f0079e570>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_stardog__lindas.py:474:
[... same traceback as above ...]
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_stardog__lindas.SPARQLWrapperTests testMethod=testSelectByPOSTinXML_Conneg> def testSelectByPOSTinXML_Conneg(self): > result = self.__generic(selectQuery, XML, POST, onlyConneg=True) test/test_stardog__lindas.py:254: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00b1c470> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e270> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testAskByGETinCSV>

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)

test/test_store__v1_1_4.py:520:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0079c470>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7fe70> headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testAskByGETinCSV_Conneg> def testAskByGETinCSV_Conneg(self): > result = self.__generic(askQuery, CSV, GET, onlyConneg=True) test/test_store__v1_1_4.py:527: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f0079c470>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7fe70> headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testAskByGETinJSON>

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_store__v1_1_4.py:583:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testAskByGETinJSONLD_Unexpected_Conneg>

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_store__v1_1_4.py:673:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testAskByGETinJSON_Conneg>

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_store__v1_1_4.py:590:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected_Conneg>

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_store__v1_1_4.py:627:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testAskByGETinTSV_Conneg>

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:560:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00914940>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5ecf0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
req = <urllib.request.Request object at 0x7f9f009159c0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d2b0>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg>

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:494:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
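The docstring also mentions the all_errors flag: with it set, create_connection() collects every per-address failure into an ExceptionGroup instead of re-raising only the last one. A sketch, assuming Python 3.11+ (this build runs 3.13); the address is again illustrative:

import socket

try:
    socket.create_connection(("127.0.0.1", 9), timeout=1.0, all_errors=True)
except* ConnectionRefusedError as group:
    # group is an ExceptionGroup holding one error per attempted address.
    for err in group.exceptions:
        print(err)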
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
req = <urllib.request.Request object at 0x7f9f008d0940>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01941fd0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected_Conneg>

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:942:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
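As the do_open() source above shows, the low-level OSError from the refused connect is re-raised wrapped in urllib.error.URLError, which is the exception each test ultimately reports. A standalone reproduction with the stdlib only; the URL is illustrative:

import urllib.request
import urllib.error

try:
    # Same shape of request the tests make, aimed at a port with no listener.
    urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=2.0)
except urllib.error.URLError as exc:
    # The original ConnectionRefusedError is preserved as exc.reason.
    print(type(exc.reason).__name__, exc.reason)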
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
req = <urllib.request.Request object at 0x7f9f008d1390>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132ecf0>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testConstructByGETinJSON_Unexpected_Conneg>

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_store__v1_1_4.py:981:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
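The test helper __generic() used by every failure here is not shown in this log. Assuming the documented SPARQLWrapper 2.0 API, a direct equivalent of one failing call would look roughly like the sketch below; the local endpoint URL is a stand-in for the suite's rewritten endpoint (the Host header above shows the real one, rdf.chise.org):

from SPARQLWrapper import SPARQLWrapper, GET, JSON

sparql = SPARQLWrapper("http://127.0.0.1:9/sparql")  # hypothetical endpoint
sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 1")
sparql.setMethod(GET)
sparql.setReturnFormat(JSON)   # drives the Accept header seen in the dumps above
sparql.setOnlyConneg(True)     # negotiate via Accept only, no format= URL parameter
result = sparql.query()        # raises urllib.error.URLError when the connect is refused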
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
req = <urllib.request.Request object at 0x7f9f008d14f0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e510>
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testConstructByGETinN3_Conneg>

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_store__v1_1_4.py:872:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
req = <urllib.request.Request object at 0x7f9f008d07e0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132c130>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML_Conneg>

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:797:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
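The tunnelling branch of do_open() shown in the first listing moves Proxy-Authorization onto the CONNECT request and deletes it from the headers later sent to the origin server. A sketch of the underlying http.client call, with a hypothetical proxy host and credentials:

import http.client

# Hypothetical proxy; the CONNECT tunnel carries Proxy-Authorization,
# and the origin server (here the suite's real host) never sees it.
conn = http.client.HTTPConnection("proxy.example", 3128)
conn.set_tunnel("rdf.chise.org", 80,
                headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
# No request is issued in this sketch; conn.request(...) would open the tunnel.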
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
req = <urllib.request.Request object at 0x7f9f008d1ff0>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7c670>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE_Conneg>

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_store__v1_1_4.py:834:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = <urllib.request.HTTPHandler object at 0x7f9f02755a90>
req = <urllib.request.Request object at 0x7f9f008d1700>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132d550>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg>

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_store__v1_1_4.py:1020:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>   raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
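Every failure in this run is the same refused connect, since pbuilder disables network access during the build and the endpoint resolves to 127.0.0.1:9. One common guard for such suites is to skip network tests when the endpoint is unreachable, sketched here with a hypothetical helper that is not part of test_store__v1_1_4.py:

import socket
import unittest

def endpoint_reachable(host="127.0.0.1", port=9, timeout=1.0):
    # Hypothetical helper: probe the endpoint once before running the class.
    try:
        socket.create_connection((host, port), timeout=timeout).close()
        return True
    except OSError:
        return False

@unittest.skipUnless(endpoint_reachable(), "SPARQL endpoint not reachable")
class OnlineSPARQLTests(unittest.TestCase):
    pass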
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg> def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) test/test_store__v1_1_4.py:1020: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f008d1700>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f0132d550> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f008d2d00>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5dd30> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg>

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:763:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testDescribeByGETinCSV_Unexpected_Conneg>

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1246:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
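Each of these failures follows the pattern visible in do_open() above: the low-level OSError from the refused connect() is wrapped in urllib.error.URLError. A minimal sketch of handling that at the call site (the URL is illustrative, mirroring the 127.0.0.1:9 address seen in the tracebacks):

    import urllib.error
    import urllib.request

    try:
        urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the underlying OSError, here
        # ConnectionRefusedError: [Errno 111] Connection refused
        print("request failed:", err.reason)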
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testDescribeByGETinJSON_Unexpected_Conneg>

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1287:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testDescribeByGETinRDFXML_Conneg>

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1097:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
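Every failure in this run has the same cause: the suite's SPARQL endpoint (Host: rdf.chise.org, with the connection attempted against 127.0.0.1:9) is unreachable in the network-isolated build environment. A common guard for such suites is to probe the endpoint once and skip when it is offline; this is only a sketch of that pattern, not what test_store__v1_1_4.py actually does, and the host and port here are illustrative:

    import socket
    import unittest

    def endpoint_reachable(host, port, timeout=2.0):
        # Best-effort TCP probe; any OSError counts as unreachable.
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable("rdf.chise.org", 80),
                         "SPARQL endpoint unreachable (offline build)")
    class OnlineSPARQLTests(unittest.TestCase):
        ...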
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow_Conneg>

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_store__v1_1_4.py:1326:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testDescribeByGETinXML_Conneg>

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1062:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testKeepAlive>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_store__v1_1_4.py:1371:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testQueryBadFormed>

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_store__v1_1_4.py:1356:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
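The testKeepAlive source shown above is the clearest picture of the client-side sequence these tests exercise. As a standalone sketch with a placeholder endpoint URL (the real suite reads endpoint from its own configuration):

    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    sparql = SPARQLWrapper("http://example.org/sparql")  # placeholder endpoint
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)  # ask for application/sparql-results+json
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()      # optional keep-alive handler, if available
    results = sparql.query().convert()  # raises URLError when unreachable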
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00aa3070>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7e7b0> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

test/test_store__v1_1_4.py:1362: in testQueryDuplicatedPrefix
    result = self.__generic(queryDuplicatedPrefix, XML, GET)
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

test/test_store__v1_1_4.py:1359: in testQueryManyPrefixes
    result = self.__generic(queryManyPrefixes, XML, GET)
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

test/test_store__v1_1_4.py:1387: in testQueryWithComma_3
    result = self.__generic(queryWithCommaInUri, XML, GET)
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

test/test_store__v1_1_4.py:247: in testSelectByGETinCSV
    result = self.__generic(selectQueryCSV_TSV, CSV, GET)
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

test/test_store__v1_1_4.py:254: in testSelectByGETinCSV_Conneg
    result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
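The innermost frame in each traceback is socket.create_connection(), whose docstring is quoted above. A self-contained sketch of the documented timeout behaviour, using a throwaway local listener rather than a real endpoint:

# Sketch (not from the build) of the socket.create_connection() call whose
# docstring appears in the tracebacks. A local listener on an OS-chosen
# port stands in for a reachable endpoint so the example is self-contained.
import socket

listener = socket.create_server(("127.0.0.1", 0))  # port 0: OS picks a free port
host, port = listener.getsockname()

# timeout is applied to the socket before connect(); source_address, if given,
# binds the local end. On failure the last error is raised, or an
# ExceptionGroup of all errors when all_errors=True (Python 3.11+).
with socket.create_connection((host, port), timeout=2.0) as sock:
    print("connected from", sock.getsockname(), "to", sock.getpeername())

listener.close()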
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testSelectByGETinCSV_Conneg> def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) test/test_store__v1_1_4.py:254: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f008641b0>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7d1d0> headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00864e10>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5ec10> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testSelectByGETinJSON> def testSelectByGETinJSON(self): > result = self.__generic(selectQuery, JSON, GET) test/test_store__v1_1_4.py:310: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00864e10>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5ec10> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00864890>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7c590> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected_Conneg> def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) test/test_store__v1_1_4.py:403: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00864890>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01d7c590> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. 
headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00864730>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5e270> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testSelectByGETinJSON_Conneg> def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) test/test_store__v1_1_4.py:317: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00864730>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f02b5e270> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = <urllib.request.HTTPHandler object at 0x7f9f02755a90> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x7f9f00865180>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x7f9f01943af0> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. 
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testSelectByGETinN3_Unexpected_Conneg>

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_store__v1_1_4.py:357:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testSelectByGETinTSV_Conneg>

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:287:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testSelectByGETinUnknow_Conneg>

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_store__v1_1_4.py:448:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self = <test.test_store__v1_1_4.SPARQLWrapperTests testMethod=testSelectByGETinXML_Conneg>

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:221:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
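From this point the failures come from test_virtuoso__v7_20_3230__dbpedia.py, so the requests go through HTTPSHandler (note the ssl.SSLContext in http_conn_args) against dbpedia.org. Per the frames above, the call pattern the tests exercise is roughly the following sketch; the endpoint URL and query text are placeholders, not taken from the test suite:

    from SPARQLWrapper import CSV, GET, SPARQLWrapper

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # placeholder endpoint
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")             # placeholder query
    sparql.setMethod(GET)
    sparql.setReturnFormat(CSV)
    result = sparql.query()  # raises urllib.error.URLError when unreachable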
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinCSV>

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:526:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinCSV_Conneg>

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:533:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinJSON>

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:586:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
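All of these tests require a live SPARQL endpoint, so they fail identically wherever outbound networking is unavailable. One common way to make such a suite degrade gracefully, sketched here as a hypothetical guard (this is not how the package's test suite is actually organised):

    import pytest
    import urllib.error
    import urllib.request

    def endpoint_reachable(url: str, timeout: float = 5.0) -> bool:
        """Best-effort probe: any HTTP answer counts as reachable."""
        try:
            urllib.request.urlopen(url, timeout=timeout)
        except urllib.error.HTTPError:
            return True   # the server answered, even with an error status
        except urllib.error.URLError:
            return False  # refused, DNS failure, timeout, ...
        return True

    # Module-level pytest marker: skip everything when the endpoint is down.
    pytestmark = pytest.mark.skipif(
        not endpoint_reachable("https://dbpedia.org/sparql"),
        reason="SPARQL endpoint unreachable",
    )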
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinJSON_Conneg>

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:593:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinTSV> def testAskByGETinTSV(self): > result = self.__generic(askQuery, TSV, GET) test/test_virtuoso__v7_20_3230__dbpedia.py:556: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f005781b0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f690> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. 
# So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00579440> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f070> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
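(As the "except OSError" branch above shows, urllib re-raises socket-level failures as URLError. A minimal reproduction against a closed local port — assuming nothing listens on 127.0.0.1:9, as in this build environment — is:)

import urllib.error
import urllib.request

try:
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
except urllib.error.URLError as err:
    # err.reason is the original OSError, here ConnectionRefusedError (errno 111)
    print(type(err.reason).__name__, err.reason)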
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinTSV_Conneg> def testAskByGETinTSV_Conneg(self): > result = self.__generic(askQuery, TSV, GET, onlyConneg=True) test/test_virtuoso__v7_20_3230__dbpedia.py:563: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00579440> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f070> headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00578100> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c830> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinUnknow> def testAskByGETinUnknow(self): > result = self.__generic(askQuery, "foo", GET) test/test_virtuoso__v7_20_3230__dbpedia.py:728: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00578100> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7c830> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0057a410> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7dc50> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
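(The call chain sparql.query() -> _query() -> urlopener(request) seen throughout these frames corresponds to ordinary SPARQLWrapper usage. A minimal sketch of what the test suite drives — the endpoint URL is a placeholder, and in this build it is effectively redirected to an unreachable address:)

from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # placeholder endpoint
sparql.setQuery("ASK WHERE { ?s ?p ?o }")
sparql.setReturnFormat(JSON)  # determines the Accept header used for conneg
result = sparql.query().convert()  # raises URLError when the endpoint is unreachable
print(result)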
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinUnknow_Conneg> def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) test/test_virtuoso__v7_20_3230__dbpedia.py:737: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0057a410> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7dc50> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f005799c0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d550> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinXML> def testAskByGETinXML(self): > result = self.__generic(askQuery, XML, GET) test/test_virtuoso__v7_20_3230__dbpedia.py:492: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f005799c0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7d550> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0057a830> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e270> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
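(do_open() forces "Connection: close" and title-cases header names before sending, which is why every failing request above carries 'Connection': 'close' regardless of what the caller set. The normalization step is easy to mirror in isolation:)

headers = {"accept": "application/sparql-results+xml",
           "user-agent": "sparqlwrapper 2.0.0"}
headers["Connection"] = "close"  # one request per connection, as explained above
headers = {name.title(): val for name, val in headers.items()}
print(headers)  # {'Accept': ..., 'User-Agent': ..., 'Connection': 'close'}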
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg> def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) test/test_virtuoso__v7_20_3230__dbpedia.py:500: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0057a830> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7e270> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0057afc0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02120f30> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '228', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
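(All of these failures share one root cause: the suite targets a live DBpedia endpoint while pbuilder disables network access, so the endpoint resolves to the unreachable 127.0.0.1:9. One common way such suites are guarded — shown here only as a hypothetical sketch, not as what this package actually does — is a reachability probe plus pytest.mark.skipif:)

import socket

import pytest


def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


@pytest.mark.skipif(not endpoint_reachable("dbpedia.org", 443),
                    reason="SPARQL endpoint unreachable (offline build?)")
def test_ask_by_get_in_json():
    ...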
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testAskByPOSTinJSON_Conneg> def testAskByPOSTinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, POST, onlyConneg=True) test/test_virtuoso__v7_20_3230__dbpedia.py:608: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0057afc0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02120f30> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '228', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f005305d0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132da90> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected_Conneg> def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) test/test_virtuoso__v7_20_3230__dbpedia.py:950: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f005305d0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132da90> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00531020> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5fcb0> headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
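(For the CONSTRUCT tests the only thing that changes is the Accept header — 'application/ld+json,application/x-json+ld' in the frame above. Content negotiation at this level is just a header on the GET request; a sketch with a placeholder endpoint:)

import urllib.parse
import urllib.request

query = urllib.parse.urlencode({"query": "CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1"})
req = urllib.request.Request(
    "https://dbpedia.org/sparql?" + query,  # placeholder endpoint
    headers={"Accept": "application/ld+json,application/x-json+ld"},
)
with urllib.request.urlopen(req, timeout=10) as resp:
    print(resp.headers.get("Content-Type"))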
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:895:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to the first failure above; Accept:
    application/ld+json,application/x-json+ld)
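The call chain repeated in every failure (test __generic -> sparql.query() ->
Wrapper._query -> urlopen) shows each test issuing a SPARQLWrapper request
against the DBpedia endpoint. A sketch of the equivalent standalone call;
the endpoint URL and query text are assumptions, only the API calls mirror
the frames logged above:

    from SPARQLWrapper import SPARQLWrapper, JSONLD, GET

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")     # endpoint assumed
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")  # query text assumed
    sparql.setMethod(GET)
    sparql.setReturnFormat(JSONLD)   # drives the Accept header seen in the log
    result = sparql.query()          # raises URLError when the host is unreachable
    print(result.response.read())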
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:904:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to the first failure above; Accept:
    application/ld+json,application/x-json+ld)
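The create_connection docstring quoted in these tracebacks mentions the
*all_errors* flag: with it, the function raises an ExceptionGroup instead of
only the last per-address error. A small sketch (Python 3.11+, consistent
with the 3.13 stdlib shown here):

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except* ConnectionRefusedError as group:
        # one entry per address returned by getaddrinfo()
        for exc in group.exceptions:
            print(type(exc).__name__, exc)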
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:866:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to the first failure above; Accept:
    application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3)
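The `timeout = <object object at 0x...>` in each frame is socket's private
_GLOBAL_DEFAULT_TIMEOUT sentinel, i.e. "fall back to getdefaulttimeout()".
How the global default feeds into create_connection:

    import socket

    print(socket.getdefaulttimeout())    # None unless previously set
    socket.setdefaulttimeout(2.0)        # new sockets now time out after 2 s
    try:
        socket.create_connection(("127.0.0.1", 9))   # inherits the default
    except ConnectionRefusedError as err:
        print(err)
    finally:
        socket.setdefaulttimeout(None)   # restore stdlib behaviour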
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:873:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to the first failure above; Accept:
    application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3)
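do_open() above moves Proxy-Authorization from the request headers into the
CONNECT tunnel headers so the credential never reaches the origin server. The
same mechanism is reachable directly through http.client; the proxy host and
credentials below are placeholders, not values from this build:

    import base64
    import http.client

    cred = base64.b64encode(b"user:pass").decode()       # hypothetical
    conn = http.client.HTTPSConnection("proxy.example", 3128)
    conn.set_tunnel("dbpedia.org", 443,
                    headers={"Proxy-Authorization": "Basic " + cred})
    # conn.request("GET", "/sparql") would now CONNECT through the proxy
    # before speaking TLS to dbpedia.org.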
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:802:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to the first failure above; Accept: application/rdf+xml)
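do_open() also forces Connection: close (the addinfourl response object
cannot manage a persistent connection) and title-cases every header name,
which is why the logged dictionaries consistently read 'Accept',
'Connection', 'Host'. The normalization in isolation:

    raw = {"accept": "application/rdf+xml", "connection": "close"}
    print({name.title(): val for name, val in raw.items()})
    # {'Accept': 'application/rdf+xml', 'Connection': 'close'}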
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:809:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to the first failure above; Accept: application/rdf+xml)
________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________

    def testConstructByGETinTURTLE(self):
>       result = self.__generic(constructQuery, TURTLE, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:833:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (traceback identical to the first failure above; Accept: application/turtle,text/turtle)
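SPARQLWrapper does not catch the transport error itself; the URLError
propagates straight out of query() in every failure above. Callers that must
tolerate an unreachable endpoint therefore have to guard the call; a sketch,
with the endpoint and query again assumed:

    import urllib.error
    from SPARQLWrapper import SPARQLWrapper, TURTLE, GET

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")     # assumed
    sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1")  # assumed
    sparql.setMethod(GET)
    sparql.setReturnFormat(TURTLE)
    try:
        sparql.query()
    except urllib.error.URLError as err:
        print("endpoint unreachable:", err.reason)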
                    del headers[proxy_auth_hdr]
                h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

            try:
                try:
>                   h.request(req.get_method(), req.selector, req.data, headers,
                              encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE_Conneg>

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:841: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f005322b0>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02123af0>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
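The create_connection source quoted in this traceback explains the failure mode: every address that getaddrinfo returns for the target is tried in turn, per-address errors are collected, and the last one is re-raised (or, with all_errors=True, an ExceptionGroup of all of them). A minimal sketch of that documented behaviour, assuming, as in this chroot, that nothing listens on localhost port 9:

    # Sketch only (not part of the build log); requires Python 3.11+ for
    # all_errors and except*. Demonstrates the create_connection branch
    # described in the docstring above.
    import socket

    try:
        # Port 9 ("discard") is assumed closed, mirroring the refused
        # connections in the failures above.
        socket.create_connection(("127.0.0.1", 9), timeout=1, all_errors=True)
    except* ConnectionRefusedError as eg:
        # With all_errors=True the collected failures arrive as an
        # ExceptionGroup instead of just the last error.
        print("all connection attempts refused:", eg.exceptions)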
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinUnknow>

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1048: 
    (urllib, http.client and socket frames identical to the first failure
    above; GET request with Accept: application/rdf+xml against host
    '127.0.0.1:9')
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
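Every failure passes through the same SPARQLWrapper frames (query() -> _query() -> urlopener(request)) before reaching urllib. A hedged sketch of the call chain these tests exercise; the endpoint URL and query are illustrative, and the URLError branch is the one that fires in this network-less build:

    # Sketch of the basic SPARQLWrapper call chain seen in the frames above.
    from urllib.error import URLError
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # illustrative endpoint
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setReturnFormat(JSON)
    try:
        results = sparql.query().convert()
    except URLError as err:
        # In an offline pbuilder chroot every request ends up here.
        print("endpoint unreachable:", err.reason)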
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg>

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1056: 
    (frames identical to the first failure above; GET request with
    Accept: application/rdf+xml against host '127.0.0.1:9')
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinXML>

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:772: 
    (frames identical to the first failure above; GET request with
    Accept: application/rdf+xml against host '127.0.0.1:9')
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
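The re-raise at urllib/request.py:1322 is why each test reports URLError rather than the underlying ConnectionRefusedError: do_open catches the OSError raised by h.request() and wraps it. The wrapping is easy to reproduce outside the suite; the closed local port here is an assumption standing in for the unreachable endpoint:

    # Sketch reproducing the OSError -> URLError wrapping shown above.
    import urllib.request
    from urllib.error import URLError

    try:
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)
    except URLError as err:
        print(type(err.reason).__name__)  # ConnectionRefusedError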
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg>

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:779: 
    (frames identical to the first failure above; GET request with
    Accept: application/rdf+xml against host '127.0.0.1:9')
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByPOSTinCSV_Unexpected_Conneg>

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:977: 
    (frames identical to the first failure above; urlencoded POST,
    Content-Length 408, Accept: */*, against host '127.0.0.1:9')
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByPOSTinN3>

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)

test/test_virtuoso__v7_20_3230__dbpedia.py:880: 
    (frames identical to the first failure above; urlencoded POST,
    Content-Length 439, Accept: application/turtle,text/turtle,text/rdf+n3,
    application/n-triples,application/n3,text/n3)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testConstructByPOSTinUnknow_Conneg>

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1073: 
    (frames identical to the first failure above; urlencoded POST,
    Content-Length 408, Accept: application/rdf+xml)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
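All of these failures are environmental rather than bugs in the package: pbuilder disables network access during the build, so the requests addressed to dbpedia.org end up at 127.0.0.1:9, as the do_open locals show, and are refused. One possible guard, shown purely as an illustration and not something this package ships, is to probe the endpoint once and skip the network-bound tests when it cannot be reached:

    # Hypothetical skip guard for network-dependent tests.
    import socket
    import unittest

    def endpoint_reachable(host="dbpedia.org", port=443, timeout=3):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    @unittest.skipUnless(endpoint_reachable(), "SPARQL endpoint unreachable")
    class NetworkBoundTests(unittest.TestCase):
        ...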
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Passing the optional *timeout* parameter will set the timeout on the
        socket instance before attempting to connect.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinCSV_Unexpected_Conneg>

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1266:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [source listing of urllib.request.AbstractHTTPHandler.do_open elided; it
    builds the HTTPSConnection, forces "Connection: close", and handles the
    proxy-tunnel branch, then:]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
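Every failure in this run bottoms out in the same two exceptions. A minimal sketch (not part of the build) reproducing both levels, assuming nothing listens on 127.0.0.1 port 9, as in this offline build environment:

import socket
import urllib.error
import urllib.request

# Socket level: nothing listens on 127.0.0.1:9, so the kernel refuses the
# TCP handshake and create_connection() raises ConnectionRefusedError.
try:
    socket.create_connection(("127.0.0.1", 9), timeout=1)
except ConnectionRefusedError as err:
    print("socket level:", err)    # [Errno 111] Connection refused

# urllib level: do_open() catches the OSError from h.request() and re-raises
# it wrapped in URLError -- the exception each test ultimately reports.
try:
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)
except urllib.error.URLError as err:
    print("urllib level:", err)    # <urlopen error [Errno 111] Connection refused>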
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1211:
    [identical URLError traceback as above;
     Accept: application/ld+json,application/x-json+ld]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1220:
    [identical URLError traceback;
     Accept: application/ld+json,application/x-json+ld]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
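The create_connection() docstring quoted in these tracebacks mentions an all_errors flag. A small sketch of that documented behaviour (Python 3.11+), reusing the same unreachable port; the hostname here is illustrative:

import socket

# With all_errors=True, create_connection() raises an ExceptionGroup holding
# the error from every address it tried (e.g. both the IPv4 and the IPv6
# result for "localhost"), instead of only the last one.
try:
    socket.create_connection(("localhost", 9), timeout=1, all_errors=True)
except ExceptionGroup as group:
    for exc in group.exceptions:
        print(type(exc).__name__, exc)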
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1181:
    [identical URLError traceback;
     Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1188:
    [identical URLError traceback; same Accept header as testDescribeByGETinN3]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
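The application-level frames are always the same: SPARQLWrapper's query() -> _query() -> urllib.request.urlopen(). A hedged sketch of how a caller could guard that call; the query text and return format are illustrative, only the endpoint host appears in the log:

from urllib.error import URLError

from SPARQLWrapper import SPARQLWrapper, JSON

# sparql.query() delegates to urlopen(), so a non-HTTP transport failure
# (like the refused connections above) surfaces as URLError to the caller.
sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("DESCRIBE <http://dbpedia.org/resource/Asturias>")  # illustrative query
sparql.setReturnFormat(JSON)

try:
    result = sparql.query()
except URLError as err:
    print("endpoint unreachable:", err.reason)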
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1117:
    [identical URLError traceback; Accept: application/rdf+xml]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1124:
    [identical URLError traceback; Accept: application/rdf+xml]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
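The do_open() listing repeated in each traceback also shows the proxy-tunnel branch: when req._tunnel_host is set, urllib connects to the proxy and issues CONNECT via set_tunnel(), moving Proxy-Authorization into the tunnel headers so it is never sent to the origin server. A sketch of that path with a hypothetical proxy address and credential:

import http.client

# Hypothetical proxy; the origin host matches the one the tests target.
conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=5)
conn.set_tunnel("dbpedia.org", 443,
                headers={"Proxy-Authorization": "Basic dXNlcjpwYXNz"})
try:
    conn.request("GET", "/sparql")
    resp = conn.getresponse()
    print(resp.status, resp.reason)
except OSError as err:    # raised when the proxy itself is unreachable
    print("proxy connection failed:", err)
finally:
    conn.close()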
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinTURTLE> def testDescribeByGETinTURTLE(self): > result = self.__generic(describeQuery, TURTLE, GET) test/test_virtuoso__v7_20_3230__dbpedia.py:1148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f004249f0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132e970> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f00424680>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132f770>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinTURTLE_Conneg>

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[pytest re-prints the do_open() frame and its locals here, identical to the
listing above, before the final error line]
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

[same ConnectionRefusedError/URLError traceback as above; Accept header
'application/rdf+xml']

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1364:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

[same traceback as above]

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1372:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

[same traceback as above; Accept header 'application/rdf+xml']

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1087:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

[same traceback as above]

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1094:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

[same ConnectionRefusedError/URLError traceback as above; Accept header
'application/sparql-results+json,application/json,text/javascript,application/javascript']

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_virtuoso__v7_20_3230__dbpedia.py:1416:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

[same ConnectionRefusedError/URLError traceback as above; Accept header
'application/sparql-results+xml']

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1401:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
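testQueryBadFormed expects the endpoint's HTTP 400 reply to surface as a QueryBadFormed exception; here the connection is refused before any HTTP exchange takes place, so assertRaises sees a URLError instead and the test errors out. A sketch of the same assertion against a reachable endpoint (endpoint URL and query text are assumptions):

    import unittest

    from SPARQLWrapper import XML, SPARQLWrapper
    from SPARQLWrapper.SPARQLExceptions import QueryBadFormed

    class BadFormedExample(unittest.TestCase):
        def test_bad_query(self):
            sparql = SPARQLWrapper("https://dbpedia.org/sparql")   # assumed
            sparql.setQuery("SELECT * WHERE { this is not sparql }")
            sparql.setReturnFormat(XML)
            # SPARQLWrapper maps the server's HTTP 400 to QueryBadFormed
            self.assertRaises(QueryBadFormed, sparql.query)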
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

[same traceback as above; Accept header 'application/sparql-results+xml']

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1407:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
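The "During handling of the above exception, another exception occurred" banner marks Python's implicit exception chaining: the socket-level ConnectionRefusedError stays attached to the URLError that _query() ultimately propagates. A sketch of how a caller can tell a refused connection apart from other URLError causes:

    import urllib.error
    import urllib.request

    def fetch(url: str) -> bytes:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.read()
        except urllib.error.URLError as err:
            # err.reason holds the original OSError from do_open()
            if isinstance(err.reason, ConnectionRefusedError):
                raise RuntimeError(f"endpoint unreachable: {url}") from err
            raise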
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f005097b0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01942f90> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
The remaining failures repeat exactly the same chain: socket.create_connection() gets ECONNREFUSED from 127.0.0.1:9, and urllib's do_open() re-raises it as URLError. Only the test method, query, and requested return format differ:

___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1404:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1428:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:248:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:255:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:308:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
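The do_open() listing dumped with the first traceback also explains why each request carries "Connection: close" (urllib's addinfourl cannot manage a persistent connection) and why the refusal surfaces as URLError: do_open() catches the underlying OSError and re-wraps it. A short sketch of the same wrapping with plain urllib; the URL is hypothetical, mirroring the log's 127.0.0.1:9 host:

import urllib.request
import urllib.error

# Nothing listens on port 9, so the ConnectionRefusedError raised inside
# socket.create_connection() is caught in do_open() and re-raised as
# URLError, which is the exception pytest ultimately reports.
req = urllib.request.Request("https://127.0.0.1:9/sparql")
try:
    urllib.request.urlopen(req, timeout=5)
except urllib.error.URLError as exc:
    print("URLError reason:", exc.reason)  # [Errno 111] Connection refused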
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:406:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:417:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0050a4c0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943690> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinJSON_Conneg> def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) test/test_virtuoso__v7_20_3230__dbpedia.py:315: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0050a4c0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943690> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:278:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError: <urlopen error [Errno 111] Connection refused>

________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:285:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError: <urlopen error [Errno 111] Connection refused>
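Note: a minimal sketch of what the failing tests drive through SPARQLWrapper (assuming the documented SPARQLWrapper 2.0.0 API; the names mirror the frames above, not the actual test code). With the endpoint unreachable, query() surfaces the socket failure as urllib.error.URLError:

    from urllib.error import URLError
    from SPARQLWrapper import SPARQLWrapper, TSV, GET

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")
    sparql.setMethod(GET)
    sparql.setReturnFormat(TSV)
    try:
        result = sparql.query()   # Wrapper._query() -> urlopen(request)
    except URLError as err:
        print("endpoint unreachable:", err.reason)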
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:448:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError: <urlopen error [Errno 111] Connection refused>

______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:457:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError: <urlopen error [Errno 111] Connection refused>
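Note: as the do_open() source in the first traceback shows, any OSError raised while connecting is re-raised as URLError, so the original errno stays available on err.reason. A small illustrative sketch (the URL is hypothetical, matching the unreachable host in this run):

    import urllib.request
    from urllib.error import URLError

    req = urllib.request.Request(
        "https://127.0.0.1:9/sparql",
        headers={"Accept": "application/sparql-results+xml"},
    )
    try:
        urllib.request.urlopen(req, timeout=5)
    except URLError as err:
        # err.reason is the wrapped ConnectionRefusedError ([Errno 111])
        print("reason:", err.reason)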
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:214:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError: <urlopen error [Errno 111] Connection refused>

________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:222:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError: <urlopen error [Errno 111] Connection refused>
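Note: the *_Conneg tests differ from their plain counterparts only in negotiating the result format via the Accept header. The values visible in this run are collected below as an illustrative sketch (the helper is hypothetical, not part of SPARQLWrapper):

    import urllib.request

    # Accept values exactly as they appear in the tracebacks in this log.
    ACCEPT_BY_FORMAT = {
        "json": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",
        "xml": "application/sparql-results+xml",
        "csv": "text/csv",
        "tsv": "text/tab-separated-values",
    }

    def make_request(endpoint: str, fmt: str) -> urllib.request.Request:
        # Negotiate the result format purely via the Accept header.
        return urllib.request.Request(
            endpoint, headers={"Accept": ACCEPT_BY_FORMAT[fmt]}
        )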
____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________

headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '349', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, POST)

test/test_virtuoso__v7_20_3230__dbpedia.py:428:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError: <urlopen error [Errno 111] Connection refused>

________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '278', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:439:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError: <urlopen error [Errno 111] Connection refused>
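Note: the POST failures above carry 'Content-Type: application/x-www-form-urlencoded' with a Content-Length of 349 or 278 bytes, i.e. the query travels form-encoded in the request body. A minimal sketch of how such a request is built with the standard library (urllib supplies the Content-Type and Content-Length headers itself for a bytes body; the query string is a placeholder):

    import urllib.parse
    import urllib.request

    query = "SELECT ?s WHERE { ?s ?p ?o } LIMIT 1"
    data = urllib.parse.urlencode({"query": query}).encode("ascii")
    req = urllib.request.Request("https://dbpedia.org/sparql", data=data)
    print(req.get_method())  # a Request with a body defaults to 'POST'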
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.  When a connection cannot be created, raises the last
        error if *all_errors* is False, and an ExceptionGroup of all
        errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_virtuoso__v7_20_3230__dbpedia.SPARQLWrapperTests testMethod=testSelectByPOSTinJSONLD_Unexpected_Conneg>

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0050a360>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01d7f690>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '278',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:528:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib, http.client and socket frames and stdlib source identical to the
 failure above; req = <urllib.request.Request object at 0x7f9f00699910>,
 host = '127.0.0.1:9',
 headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org',
            'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'})
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
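Every failure in this run bottoms out in socket.create_connection(), which tries each getaddrinfo() result in turn, collects the per-address exceptions, and re-raises the last one when nothing is reachable. A minimal sketch of that failure mode, assuming (as in this sandboxed build) that nothing listens on 127.0.0.1 port 9:

    # Minimal sketch: create_connection() raising when the target port is closed.
    # Port 9 (discard) on 127.0.0.1 is assumed closed, as in the build chroot
    # where network access is disabled.
    import socket

    try:
        sock = socket.create_connection(("127.0.0.1", 9), timeout=5)
    except ConnectionRefusedError as err:
        print(err)   # [Errno 111] Connection refused
    else:
        sock.close()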
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:535:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib, http.client and socket frames and stdlib source identical to the
 first failure above; req = <urllib.request.Request object at 0x7f9f0069a4c0>,
 host = '127.0.0.1:9',
 headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org',
            'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'})
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:588:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib, http.client and socket frames and stdlib source identical to the
 first failure above; req = <urllib.request.Request object at 0x7f9f0069a150>,
 host = '127.0.0.1:9',
 headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
            'Connection': 'close', 'Host': 'live.dbpedia.org',
            'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'})
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:595:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib, http.client and socket frames and stdlib source identical to the
 first failure above; req = <urllib.request.Request object at 0x7f9f0069a410>,
 host = '127.0.0.1:9',
 headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
            'Connection': 'close', 'Host': 'live.dbpedia.org',
            'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'})
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:558:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib, http.client and socket frames and stdlib source identical to the
 first failure above; req = <urllib.request.Request object at 0x7f9f0069aaf0>,
 host = '127.0.0.1:9',
 headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
            'Host': 'live.dbpedia.org',
            'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'})
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:565:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib, http.client and socket frames and stdlib source identical to the
 first failure above; req = <urllib.request.Request object at 0x7f9f00699ff0>,
 host = '127.0.0.1:9',
 headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
            'Host': 'live.dbpedia.org',
            'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'})
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:731:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib, http.client and socket frames and stdlib source identical to the
 first failure above; req = <urllib.request.Request object at 0x7f9f0069b490>,
 host = '127.0.0.1:9',
 headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
            'Host': 'live.dbpedia.org',
            'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'})
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:740:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(urllib, http.client and socket frames and stdlib source identical to the
 first failure above; req = <urllib.request.Request object at 0x7f9f0069a360>,
 host = '127.0.0.1:9',
 headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
            'Host': 'live.dbpedia.org',
            'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'})
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0069b8b0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02121c50> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
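Every failure in this run reduces to the same root cause: network access is disabled during the build, the chroot's HTTPS proxy resolves to 127.0.0.1:9, nothing listens there, and socket.create_connection is refused before any SPARQL endpoint is ever reached. A minimal sketch (not part of the build; it assumes nothing listens on local port 9) reproducing the same wrapped error:

    import urllib.error
    import urllib.request

    try:
        # Port 9 (discard) has no listener here, mirroring the build chroot.
        urllib.request.urlopen("http://127.0.0.1:9/", timeout=5)
    except urllib.error.URLError as err:
        # err.reason carries the underlying ConnectionRefusedError ([Errno 111]).
        print("refused as in the log:", err.reason)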
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:494:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:502:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:954:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
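The paired tracebacks joined by "During handling of the above exception, another exception occurred" are ordinary Python exception chaining: urllib's do_open catches the OSError raised by the socket layer and re-raises it wrapped in URLError. A self-contained sketch of that pattern:

    import urllib.error

    try:
        try:
            raise ConnectionRefusedError(111, "Connection refused")
        except OSError as err:  # what urllib's do_open does
            raise urllib.error.URLError(err)
    except urllib.error.URLError as err:
        print(err.reason)  # -> [Errno 111] Connection refused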
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:899:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:908:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:869:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:876:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f0064ed00> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132d8d0> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x7f9f0064ed00>
http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132d8d0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinRDFXML>

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:805:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
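socket.create_connection, quoted in full above, tries each getaddrinfo result in turn and re-raises the last error when none succeeds. The same refusal can be reproduced directly (a sketch; port 9 is the address used throughout this run):

import socket

try:
    sock = socket.create_connection(("127.0.0.1", 9), timeout=5)
except ConnectionRefusedError as err:
    print(err)   # [Errno 111] Connection refused, exactly as logged
else:
    sock.close()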
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

req = <urllib.request.Request object at 0x7f9f0064efc0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132d010>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_virtuoso__v8_03_3313__dbpedia.py:812: in testConstructByGETinRDFXML_Conneg
    result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________

req = <urllib.request.Request object at 0x7f9f0064f120>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132c210>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_virtuoso__v8_03_3313__dbpedia.py:836: in testConstructByGETinTURTLE
    result = self.__generic(constructQuery, TURTLE, GET)
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
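Each *_Conneg test repeats its sibling with onlyConneg=True, i.e. the wrapper advertises the desired format purely through the HTTP Accept header rather than through any endpoint-specific query parameter. In SPARQLWrapper that corresponds to setOnlyConneg; a sketch under the same assumed endpoint:

from SPARQLWrapper import GET, TURTLE, SPARQLWrapper

sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")
sparql.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 5")  # illustrative query
sparql.setMethod(GET)
sparql.setOnlyConneg(True)      # rely on content negotiation only
sparql.setReturnFormat(TURTLE)  # logged as Accept: application/turtle,text/turtle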
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

req = <urllib.request.Request object at 0x7f9f0064f490>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132d2b0>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_virtuoso__v8_03_3313__dbpedia.py:844: in testConstructByGETinTURTLE_Conneg
    result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

req = <urllib.request.Request object at 0x7f9f0064db20>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132c130>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_virtuoso__v8_03_3313__dbpedia.py:1053: in testConstructByGETinUnknow
    result = self.__generic(constructQuery, "foo", GET)
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
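The "Unknow" tests request the bogus format "foo", yet the logged Accept header is still application/rdf+xml, so an unrecognised format evidently falls back to the wrapper's default. Collecting the Accept headers observed across this run (read off the tracebacks, not from the SPARQLWrapper source):

# Requested return format -> Accept header actually sent (per this log)
ACCEPT_SENT = {
    "N3":     "application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3",
    "TURTLE": "application/turtle,text/turtle",
    "RDFXML": "application/rdf+xml",
    "XML":    "application/rdf+xml",
    "foo":    "application/rdf+xml",  # unknown format, default used
    "JSONLD": "application/ld+json,application/x-json+ld",
    "CSV (unexpected for DESCRIBE)": "*/*",
}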
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

req = <urllib.request.Request object at 0x7f9f0064f8b0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0132d0f0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_virtuoso__v8_03_3313__dbpedia.py:1061: in testConstructByGETinUnknow_Conneg
    result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

req = <urllib.request.Request object at 0x7f9f0064e780>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01941470>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_virtuoso__v8_03_3313__dbpedia.py:775: in testConstructByGETinXML
    result = self.__generic(constructQuery, XML, GET)
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
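Every failure in this file is a connectivity error (each request is answered by nothing at 127.0.0.1:9), not an assertion failure. A common way to guard live-endpoint suites is a reachability probe plus skipUnless; a sketch, not what this test suite actually does:

import socket
import unittest

def reachable(host: str, port: int = 443, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

@unittest.skipUnless(reachable("live.dbpedia.org"), "live endpoint unreachable")
class LiveEndpointTests(unittest.TestCase):
    pass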
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

req = <urllib.request.Request object at 0x7f9f0064f5f0>
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f019404b0>
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_virtuoso__v8_03_3313__dbpedia.py:782: in testConstructByGETinXML_Conneg
    result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00315de0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f002e18d0> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. 
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1272:
    [GET to host '127.0.0.1:9' (Host: live.dbpedia.org) with Accept: */*;
    do_open/create_connection traceback identical to the one shown in full
    for testConstructByGETinXML_Conneg above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1217:
    [Accept: application/ld+json,application/x-json+ld; traceback identical
    to the one above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
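Every failing test funnels through the same path visible in the traceback: SPARQLWrapper/Wrapper.py:960 query() -> :926 _query() -> urllib's urlopen. Roughly, each test body reduces to a call shaped like the following sketch (the endpoint URL and DESCRIBE query here are illustrative placeholders, not the test suite's actual values):

    from SPARQLWrapper import SPARQLWrapper, N3, GET

    sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")  # assumed endpoint
    sparql.setQuery("DESCRIBE <http://dbpedia.org/resource/Asturias>")
    sparql.setMethod(GET)
    sparql.setReturnFormat(N3)    # selects the Accept header seen in the log
    sparql.setOnlyConneg(True)    # content negotiation only, no format params
    result = sparql.query()       # raises URLError when the endpoint is unreachable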
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1226:
    [Accept: application/ld+json,application/x-json+ld; traceback identical
    to the one above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1187:
    [Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3;
    traceback identical to the one above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
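The create_connection() docstring reproduced above notes the all_errors flag (Python 3.11+): by default the loop over getaddrinfo() results re-raises only the last error, while all_errors=True surfaces every attempt as an ExceptionGroup. A short sketch, again assuming nothing listens on 127.0.0.1:9:

    import socket

    try:
        socket.create_connection(("127.0.0.1", 9), timeout=2, all_errors=True)
    except ExceptionGroup as eg:
        # One ConnectionRefusedError per address getaddrinfo() returned.
        for err in eg.exceptions:
            print(type(err).__name__, err)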
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1194:
    [Accept: application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3;
    traceback identical to the one above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1123:
    [Accept: application/rdf+xml; traceback identical to the one above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1130:
    [Accept: application/rdf+xml; traceback identical to the one above]
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00317070> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5dc50> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = <object object at 0x7f9f04858cd0> source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. 
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinTURTLE> def testDescribeByGETinTURTLE(self): > result = self.__generic(describeQuery, TURTLE, GET) test/test_virtuoso__v8_03_3313__dbpedia.py:1154: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00317070> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f02b5dc50> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f00317330> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f0030dfd0> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
______________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinTURTLE_Conneg>

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1162:
(remaining frames identical to testDescribeByGETinTURTLE above)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError

_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow>

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1370:
(remaining frames identical to testDescribeByGETinTURTLE above; Accept header sent: 'application/rdf+xml')
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
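do_open, repeated in each of these tracebacks, force-sets "Connection: close" because urllib's addinfourl response object cannot manage a persistent connection. The same effect, spelled out explicitly on a plain urllib request (the endpoint URL below is a placeholder, not taken from this build):

    import urllib.request

    # urllib's AbstractHTTPHandler.do_open overwrites any Connection header
    # with "close" before sending; setting it yourself is therefore redundant
    # but makes the non-persistent behaviour visible at the call site.
    req = urllib.request.Request(
        "https://example.org/sparql",
        headers={"Accept": "application/sparql-results+xml"},
    )
    req.add_header("Connection", "close")

    # urllib.request.urlopen(req)  # would perform one non-persistent request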
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinUnknow_Conneg>

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1378:
(remaining frames identical to the failures above; Accept header sent: 'application/rdf+xml')
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError

__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinXML>

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1093:
(remaining frames identical to the failures above; Accept header sent: 'application/rdf+xml')
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
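The request locals in these tracebacks show which Accept header SPARQLWrapper 2.0.0 sent for each requested return format (Turtle, RDF/XML, JSON). A hypothetical lookup reproducing only the values visible in this log, not SPARQLWrapper's actual internal table:

    # Accept headers as observed in the request locals of this log.
    ACCEPT_BY_FORMAT = {
        "turtle": "application/turtle,text/turtle",
        "xml": "application/rdf+xml",   # DESCRIBE results as RDF/XML
        "json": "application/sparql-results+json,application/json,"
                "text/javascript,application/javascript",
    }

    def accept_header(fmt: str) -> str:
        # Unknown formats ("foo") fell back to RDF/XML in the
        # testDescribeByGETinUnknow requests above.
        return ACCEPT_BY_FORMAT.get(fmt, "application/rdf+xml")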
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testDescribeByGETinXML_Conneg>

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1100:
(remaining frames identical to the failures above; Accept header sent: 'application/rdf+xml')
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError

_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testKeepAlive>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_virtuoso__v8_03_3313__dbpedia.py:1422:
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
(remaining frames identical to the failures above; Accept header sent: 'application/sparql-results+json,application/json,text/javascript,application/javascript')
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
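The body of testKeepAlive above doubles as a usage example for the package under test. A self-contained version of the same call sequence, assuming a reachable endpoint (the URL is a placeholder; in this build chroot every connection is refused, which is exactly the failure recorded above):

    from SPARQLWrapper import SPARQLWrapper, JSON, GET

    # Same call sequence as testKeepAlive; endpoint URL is a placeholder.
    sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")
    sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
    sparql.setReturnFormat(JSON)
    sparql.setMethod(GET)
    sparql.setUseKeepAlive()      # reuses connections when the optional
                                  # keepalive handler is available
    results = sparql.query().convert()  # raises URLError when offline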
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testQueryBadFormed>

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1407:
(remaining frames identical to the failures above)
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
/usr/lib/python3.13/urllib/request.py:1322: URLError
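Every failure in this run is the same refused connection, because the reproducible build disables network access and routes requests to a blackholed proxy. One common way to keep such suites green offline, sketched here as a hypothetical guard (not something this test suite does; endpoint_reachable and requires_network are made-up names), is to probe the endpoint and skip:

    import socket
    import pytest

    def endpoint_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
        """Cheap reachability probe; returns False in offline build chroots."""
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    requires_network = pytest.mark.skipif(
        not endpoint_reachable("live.dbpedia.org", 443),
        reason="SPARQL endpoint unreachable (network disabled during build)",
    )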
Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testQueryDuplicatedPrefix> def testQueryDuplicatedPrefix(self): > result = self.__generic(queryDuplicatedPrefix, XML, GET) test/test_virtuoso__v8_03_3313__dbpedia.py:1413: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f002ea8e0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943a10> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. 
# It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f002ea0a0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01940130> headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
[The remaining failures in this module repeat the traceback above verbatim; only the test method, the failing call site, and the Accept header vary. The duplicated socket.create_connection() and do_open() source listings are elided below.]

___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1410:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (Accept: application/sparql-results+xml)

___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1426:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (Accept: application/sparql-results+xml)

___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1433:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (Accept: application/sparql-results+xml)

___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:248:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (Accept: text/csv)

________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:255:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (Accept: text/csv)
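The Accept header in each elided traceback tracks the return format the test requested: application/sparql-results+xml for XML, text/csv for CSV, the SPARQL JSON media types for JSON, and */* for the "unexpected" JSON-LD cases. A minimal sketch of how such a request is issued through SPARQLWrapper's public API (version 2.0.0, as identified by the User-Agent in the log) follows; the endpoint URL is an assumption for illustration, since the tests' actual target is only visible here through the Host header live.dbpedia.org:

    # Sketch of the kind of GET request these tests issue. The endpoint
    # URL is assumed; the build environment cannot reach it either way.
    from SPARQLWrapper import SPARQLWrapper, GET, CSV

    sparql = SPARQLWrapper("https://live.dbpedia.org/sparql")  # assumed endpoint
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setMethod(GET)
    sparql.setReturnFormat(CSV)   # sends "Accept: text/csv", as in the log
    result = sparql.query()       # raises URLError here when the network is down
    print(result.convert())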
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:308:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (Accept: application/sparql-results+json,application/json,text/javascript,application/javascript)

____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:406:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (Accept: */*)

_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:417:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
    (Accept: */*)
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = <urllib.request.HTTPSHandler object at 0x7f9f027563c0> http_class = <class 'http.client.HTTPSConnection'> req = <urllib.request.Request object at 0x7f9f002e9de0> http_conn_args = {'context': <ssl.SSLContext object at 0x7f9f0222e2a0>} host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x7f9f01943af0> headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
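The failure mode is uniform across the suite: network access is disabled during this build, and the requests all end up at host = '127.0.0.1:9' (the discard port) with Host 'live.dbpedia.org', so every connect() is refused and urllib wraps the OSError in a URLError. A minimal sketch of the same two-layer failure outside the test harness (the URL is illustrative, not part of the build):

import socket
import urllib.error
import urllib.request

try:
    # Nothing listens on port 9 here, so the TCP connect is refused.
    socket.create_connection(("127.0.0.1", 9), timeout=1)
except ConnectionRefusedError as err:
    print("socket layer:", err)          # [Errno 111] Connection refused

try:
    # urllib catches the OSError above and re-raises it as URLError.
    urllib.request.urlopen("https://127.0.0.1:9/sparql", timeout=1)
except urllib.error.URLError as err:
    print("urllib layer:", err.reason)   # the original ConnectionRefusedError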
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinTSV>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:278:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
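For callers of urllib, the distinction visible in these tracebacks is that transport problems surface as URLError with the original OSError kept in .reason, while an HTTP-level error response would surface as HTTPError (a URLError subclass). A generic handling sketch, not part of the package:

import urllib.error
import urllib.request

def fetch(url: str) -> bytes:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.read()
    except urllib.error.HTTPError as err:
        # The server answered, but with a 4xx/5xx status.
        raise RuntimeError(f"HTTP {err.code} from {url}") from err
    except urllib.error.URLError as err:
        # No usable connection at all; err.reason is the underlying OSError.
        raise RuntimeError(f"transport failure: {err.reason}") from err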
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinTSV_Conneg>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:285:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinUnknow>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:450:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
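Stripped of the test harness, each of these cases boils down to a plain SPARQLWrapper query: Wrapper.py:960 (query) and Wrapper.py:926 (_query) in the tracebacks are reached from code like the following sketch (the endpoint URL is illustrative and assumed reachable):

from SPARQLWrapper import SPARQLWrapper, JSON, GET

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setMethod(GET)
sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 5")
sparql.setReturnFormat(JSON)          # drives the Accept header, as in the log

# query() performs the HTTP request; with an unreachable host it raises the
# same urllib.error.URLError recorded above.
results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["s"]["value"])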
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinUnknow_Conneg>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:459:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
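These failures are environmental rather than regressions in the package. One conventional way, shown here only as a sketch and not what this package does, to keep network-bound tests from failing in offline builds is to probe the endpoint once and skip:

import socket
import unittest

def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

@unittest.skipUnless(endpoint_reachable("live.dbpedia.org", 443),
                     "SPARQL endpoint not reachable (offline build?)")
class SPARQLWrapperNetworkTests(unittest.TestCase):
    def test_placeholder(self):
        self.assertTrue(True)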
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinXML>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:214:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
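Comparing the headers dicts across these failures shows the content negotiation under test: the Accept header tracks the requested return format, and an unknown alias like "foo" falls back to SPARQL XML. Summarized from the values logged above (the dict itself is just an illustration, not SPARQLWrapper code):

# Accept headers observed for SELECT queries in this run, per requested format.
ACCEPT_BY_FORMAT = {
    "json": "application/sparql-results+json,application/json,"
            "text/javascript,application/javascript",
    "tsv": "text/tab-separated-values",
    "xml": "application/sparql-results+xml",
    "foo": "application/sparql-results+xml",  # unknown alias -> XML fallback
}

for fmt, accept in ACCEPT_BY_FORMAT.items():
    print(f"{fmt:>4} -> Accept: {accept}")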
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self = <test.test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testSelectByGETinXML_Conneg>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:222:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
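The only non-network failure in this run, QueryResult_Test.testConvert below, counts emitted warnings via warnings.catch_warnings(record=True). The same recording pattern in isolation (a generic sketch, not the package's code):

import warnings

def count_warnings(fn) -> int:
    """Run fn and return how many warnings it emitted."""
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")  # make sure nothing is suppressed
        fn()
        return len(caught)

n = count_warnings(lambda: warnings.warn("example", RuntimeWarning))
print(n)  # 1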
_________________________ QueryResult_Test.testConvert _________________________

self = <test.test_wrapper.QueryResult_Test testMethod=testConvert>

    def testConvert(self):
        class FakeResponse(object):
            def __init__(self, content_type):
                self.content_type = content_type

            def info(self):
                return {"content-type": self.content_type}

            def read(self, len):
                return ""

        def _mime_vs_type(mime, requested_type):
            """
            :param mime: mimetype/Content-Type of the response
            :param requested_type: requested mimetype (alias)
            :return: number of warnings produced by combo
            """
            with warnings.catch_warnings(record=True) as w:
                qr = QueryResult((FakeResponse(mime), requested_type))
                try:
                    qr.convert()
                except:
                    pass
                # if len(w) > 0: print(w[0].message)  # FOR DEBUG
                # if len(w) > 1: print(w[1].message)  # FOR DEBUG
                return len(w)

        # In the cases of "application/ld+json" and "application/rdf+xml", the
        # RDFLib raised a warning because the manually created QueryResult has no real
        # response value (implemented a fake read).
        # "WARNING:rdflib.term: does not look like a valid URI, trying to serialize this will break."
        self.assertEqual(0, _mime_vs_type("application/sparql-results+xml", XML))
        self.assertEqual(0, _mime_vs_type("application/sparql-results+json", JSON))
        self.assertEqual(0, _mime_vs_type("text/n3", N3))
        self.assertEqual(0, _mime_vs_type("text/turtle", TURTLE))
        self.assertEqual(0, _mime_vs_type("application/turtle", TURTLE))
        self.assertEqual(0, _mime_vs_type("application/json", JSON))
>       self.assertEqual(0, _mime_vs_type("application/ld+json", JSONLD))
E       AssertionError: 0 != 1

test/test_wrapper.py:876: AssertionError
=============================== warnings summary ===============================
test/test_agrovoc-allegrograph_on_hold.py:164
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_agrovoc-allegrograph_on_hold.py:164: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_allegrograph__v4_14_1__mmi.py:163
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_allegrograph__v4_14_1__mmi.py:163: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_blazegraph__wikidata.py:172
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_blazegraph__wikidata.py:172: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_fuseki2__v3_6_0__agrovoc.py:164
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_fuseki2__v3_6_0__agrovoc.py:164: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_fuseki2__v3_8_0__stw.py:165
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_fuseki2__v3_8_0__stw.py:165: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_graphdbEnterprise__v8_9_0__rs.py:176
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_graphdbEnterprise__v8_9_0__rs.py:176: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_lov-fuseki_on_hold.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_lov-fuseki_on_hold.py:167: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_rdf4j__geosciml.py:173
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_rdf4j__geosciml.py:173: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_stardog__lindas.py:177
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_stardog__lindas.py:177: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_store__v1_1_4.py:162
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_store__v1_1_4.py:162: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_virtuoso__v7_20_3230__dbpedia.py:164
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_virtuoso__v7_20_3230__dbpedia.py:164: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_virtuoso__v8_03_3313__dbpedia.py:164
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_virtuoso__v8_03_3313__dbpedia.py:164: SyntaxWarning: invalid escape sequence '\:'
    queryWithCommaInCurie_2 = """

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json-ld' in a 'ASK' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'n3' in a 'ASK' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 4 warnings
test/test_allegrograph__v4_14_1__mmi.py: 8 warnings
test/test_blazegraph__wikidata.py: 8 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 8 warnings
test/test_fuseki2__v3_8_0__stw.py: 8 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 4 warnings
test/test_lov-fuseki_on_hold.py: 8 warnings
test/test_rdf4j__geosciml.py: 4 warnings
test/test_stardog__lindas.py: 4 warnings
test/test_store__v1_1_4.py: 8 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 8 warnings
test/test_virtuoso__v8_03_3313__dbpedia.py: 8 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:348: SyntaxWarning: Ignore format 'foo'; current instance supports: json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld.
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 4 warnings
test/test_allegrograph__v4_14_1__mmi.py: 8 warnings
test/test_blazegraph__wikidata.py: 8 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 8 warnings
test/test_fuseki2__v3_8_0__stw.py: 8 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 4 warnings
test/test_rdf4j__geosciml.py: 4 warnings
test/test_stardog__lindas.py: 4 warnings
test/test_store__v1_1_4.py: 8 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:348: SyntaxWarning: Ignore format 'bar'; current instance supports: json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld.
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 2 warnings
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'csv' in a 'CONSTRUCT' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 2 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 4 warnings
test/test_fuseki2__v3_8_0__stw.py: 4 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json' in a 'CONSTRUCT' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'csv' in a 'DESCRIBE' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py:
2 warnings test/test_fuseki2__v3_6_0__agrovoc.py: 4 warnings test/test_fuseki2__v3_8_0__stw.py: 4 warnings test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings test/test_lov-fuseki_on_hold.py: 2 warnings test/test_rdf4j__geosciml.py: 2 warnings test/test_stardog__lindas.py: 2 warnings test/test_store__v1_1_4.py: 3 warnings /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json' in a 'DESCRIBE' SPARQL query form warnings.warn( test/test_agrovoc-allegrograph_on_hold.py: 1 warning test/test_allegrograph__v4_14_1__mmi.py: 1 warning test/test_blazegraph__wikidata.py: 1 warning test/test_fuseki2__v3_6_0__agrovoc.py: 1 warning test/test_fuseki2__v3_8_0__stw.py: 1 warning test/test_graphdbEnterprise__v8_9_0__rs.py: 1 warning test/test_lov-fuseki_on_hold.py: 1 warning test/test_rdf4j__geosciml.py: 1 warning test/test_stardog__lindas.py: 1 warning test/test_store__v1_1_4.py: 1 warning test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:663: UserWarning: keepalive support not available, so the execution of this method has no effect warnings.warn( test/test_agrovoc-allegrograph_on_hold.py: 2 warnings test/test_allegrograph__v4_14_1__mmi.py: 4 warnings test/test_blazegraph__wikidata.py: 4 warnings test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings test/test_fuseki2__v3_8_0__stw.py: 2 warnings test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings test/test_lov-fuseki_on_hold.py: 2 warnings test/test_rdf4j__geosciml.py: 2 warnings test/test_stardog__lindas.py: 2 warnings test/test_store__v1_1_4.py: 3 warnings test/test_virtuoso__v7_20_3230__dbpedia.py: 4 warnings test/test_virtuoso__v8_03_3313__dbpedia.py: 2 warnings test/test_wrapper.py: 1 warning /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json-ld' in a 'SELECT' SPARQL query form warnings.warn( test/test_agrovoc-allegrograph_on_hold.py: 2 warnings test/test_allegrograph__v4_14_1__mmi.py: 4 warnings test/test_blazegraph__wikidata.py: 4 warnings test/test_cli.py: 1 warning test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings test/test_fuseki2__v3_8_0__stw.py: 2 warnings test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings test/test_lov-fuseki_on_hold.py: 2 warnings test/test_rdf4j__geosciml.py: 2 warnings test/test_stardog__lindas.py: 2 warnings test/test_store__v1_1_4.py: 3 warnings /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'n3' in a 'SELECT' SPARQL query form warnings.warn( test/test_cli.py::SPARQLWrapperCLI_Test::testQueryRDF /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'rdf' in a 'DESCRIBE' SPARQL query form warnings.warn( test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileRDFXML /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header 
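Two warning families dominate the summary above. The SyntaxWarning entries come from compiling string literals containing '\:' -- since Python 3.12 an unrecognized backslash escape in a non-raw string literal is reported as a SyntaxWarning, and this build runs the tests under Python 3.13. The variable name queryWithCommaInCurie_2 is taken from the test sources; the query text below is an illustrative sketch, not copied from them:

    # Warns on Python 3.12+: '\:' is not a recognized escape sequence,
    # so the backslash is kept literally and the compiler warns.
    queryWithCommaInCurie_2 = """
        SELECT * WHERE { ?s ?p dbpedia\:Category\,Example }
    """

    # Quiet fix: a raw string passes the backslash through to the SPARQL
    # parser (which uses it to escape ':' and ',' in prefixed names)
    # without involving Python's escape processing.
    queryWithCommaInCurie_2 = r"""
        SELECT * WHERE { ?s ?p dbpedia\:Category\,Example }
    """

The RuntimeWarning entries are SPARQLWrapper itself reporting that the requested return format does not apply to the query form being sent, so it falls back to 'Accept: */*'. A minimal sketch reproducing the 'csv' in a 'CONSTRUCT' case, against a hypothetical endpoint:

    from SPARQLWrapper import SPARQLWrapper, CSV

    sparql = SPARQLWrapper("http://example.org/sparql")  # hypothetical endpoint
    sparql.setQuery("CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setReturnFormat(CSV)  # CSV only fits SELECT/ASK result sets
    results = sparql.query()     # Wrapper.py warns and sends 'Accept: */*'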
=========================== short test summary info ============================
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinJSON
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinJSONLD
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinN3
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinN3
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinCSV
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinJSON
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinTSV
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSON
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinN3
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinN3
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinN3
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinN3
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinCSV
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSON
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinTSV
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSON
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinUnknow
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSON
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinXML
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_cli.py::SPARQLWrapperCLIParser_Test::testInvalidFormat - Ass...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryRDF - urllib.error.U...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryTo4store - urllib.er...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToAgrovoc_AllegroGraph
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToAllegroGraph - url...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToBrazeGraph - urlli...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToFuseki2V3_6 - urll...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToFuseki2V3_8 - urll...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToGraphDBEnterprise
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToLovFuseki - urllib...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToRDF4J - urllib.err...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToStardog - urllib.e...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToVirtuosoV7 - urlli...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToVirtuosoV8 - urlli...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithEndpoint - urlli...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFile - urllib.er...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileCSV - urllib...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileN3 - urllib....
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileRDFXML - url...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTSV - urllib...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtle - url...
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtleQuiet
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileXML - urllib...
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinCSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinCSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSON
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinTSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinTSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSONLD
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinCSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSON
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinTSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinCSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinCSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSON
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinTSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinTSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinCSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSON
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinTSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinTURTLE
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSONLD FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinTURTLE FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testKeepAlive - u... FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED 
test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testKeepAlive - urll... 
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryBadFormed_1 FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED 
test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testKeepAlive - urll... 
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinCSV - ur... FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSON - u... FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testKeepAlive - urllib... 
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryBadFormed - u... FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3 FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED 
test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinN3 FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED 
test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryBadFormed FAILED 
test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_wrapper.py::QueryResult_Test::testConvert - AssertionError: ... = 858 failed, 38 passed, 549 skipped, 80 xfailed, 381 warnings in 2205.90s (0:36:45) = E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build; python3.13 -m pytest test dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.13 returned exit code 13 make[1]: Leaving directory '/build/reproducible-path/sparql-wrapper-python-2.0.0' create-stamp debian/debhelper-build-stamp dh_testroot -O--buildsystem=pybuild dh_prep -O--buildsystem=pybuild dh_auto_install --destdir=debian/python3-sparqlwrapper/ -O--buildsystem=pybuild I: pybuild pybuild:308: rm -fr /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test I: pybuild base:311: /usr/bin/python3 setup.py install --root /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper running install /usr/lib/python3/dist-packages/setuptools/_distutils/cmd.py:79: SetuptoolsDeprecationWarning: setup.py install is deprecated. !! ******************************************************************************** Please avoid running ``setup.py`` directly. Instead, use pypa/build, pypa/installer or other standards-based tools. See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details. ******************************************************************************** !! 
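[The mass failures above are consistent with the network isolation announced at the start of the build: these suites target live SPARQL endpoints (DBpedia via Virtuoso, GraphDB, RDF4J, Stardog, LOV/Fuseki), so inside the offline chroot every request fails, typically with urllib.error.URLError, as the truncated "- urll..." messages suggest. A minimal sketch of the kind of round trip these tests perform, not taken from the test suite and with an illustrative endpoint URL:

    import urllib.error
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # live endpoint, needs network
    sparql.setQuery("ASK WHERE { ?s ?p ?o }")
    sparql.setReturnFormat(JSON)
    try:
        result = sparql.query().convert()   # the HTTP request happens here
        print(result["boolean"])
    except urllib.error.URLError as exc:
        # In a network-disabled pbuilder chroot this branch is always taken.
        print("endpoint unreachable:", exc.reason)

Note that the build nevertheless continues into the install step below, so the failing dh_auto_test is evidently tolerated by this package's build setup; the log alone does not show how.]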
create-stamp debian/debhelper-build-stamp
dh_testroot -O--buildsystem=pybuild
dh_prep -O--buildsystem=pybuild
dh_auto_install --destdir=debian/python3-sparqlwrapper/ -O--buildsystem=pybuild
I: pybuild pybuild:308: rm -fr /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test
I: pybuild base:311: /usr/bin/python3 setup.py install --root /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper
running install
/usr/lib/python3/dist-packages/setuptools/_distutils/cmd.py:79: SetuptoolsDeprecationWarning: setup.py install is deprecated.
!!
        ********************************************************************************
        Please avoid running ``setup.py`` directly. Instead, use pypa/build, pypa/installer or other standards-based tools.
        See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
        ********************************************************************************
!!
  self.initialize_options()
running build
running build_py
running install_lib
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/.gitignore -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/CACHEDIR.TAG -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/v/cache/lastfailed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/v/cache/nodeids -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/v/cache/stepwise -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/README.md -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__init__.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/py.typed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/main.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/SmartWrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/Wrapper.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/main.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/SPARQLExceptions.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/KeyCaseInsensitiveDict.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/sparql_dataframe.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/SmartWrapper.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/__init__.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/sparql_dataframe.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/SPARQLExceptions.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__init__.py to __init__.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/Wrapper.py to Wrapper.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/KeyCaseInsensitiveDict.py to KeyCaseInsensitiveDict.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/main.py to main.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/SmartWrapper.py to SmartWrapper.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/sparql_dataframe.py to sparql_dataframe.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/SPARQLExceptions.py to SPARQLExceptions.cpython-313.pyc
running install_egg_info
running egg_info
creating SPARQLWrapper.egg-info
writing SPARQLWrapper.egg-info/PKG-INFO
writing dependency_links to SPARQLWrapper.egg-info/dependency_links.txt
writing entry points to SPARQLWrapper.egg-info/entry_points.txt
writing requirements to SPARQLWrapper.egg-info/requires.txt
writing top-level names to SPARQLWrapper.egg-info/top_level.txt
writing manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
reading manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files found matching 'Makefile'
warning: no directories found matching 'docs/build/html'
adding license file 'LICENSE.txt'
adding license file 'AUTHORS.md'
writing manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
Copying SPARQLWrapper.egg-info to /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper-2.0.0.egg-info
Skipping SOURCES.txt
running install_scripts
Installing rqw script to /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/bin
dh_installdocs -O--buildsystem=pybuild
dh_installchangelogs -O--buildsystem=pybuild
dh_installexamples -O--buildsystem=pybuild
dh_python3 -O--buildsystem=pybuild
dh_installsystemduser -O--buildsystem=pybuild
dh_perl -O--buildsystem=pybuild
dh_link -O--buildsystem=pybuild
dh_strip_nondeterminism -O--buildsystem=pybuild
dh_compress -O--buildsystem=pybuild
dh_fixperms -O--buildsystem=pybuild
dh_missing -O--buildsystem=pybuild
dh_installdeb -O--buildsystem=pybuild
dh_gencontrol -O--buildsystem=pybuild
dh_md5sums -O--buildsystem=pybuild
dh_builddeb -O--buildsystem=pybuild
dpkg-deb: building package 'python3-sparqlwrapper' in '../python3-sparqlwrapper_2.0.0-2_all.deb'.
dpkg-genbuildinfo --build=binary -O../sparql-wrapper-python_2.0.0-2_amd64.buildinfo
dpkg-genchanges --build=binary -O../sparql-wrapper-python_2.0.0-2_amd64.changes
dpkg-genchanges: info: binary-only upload (no source code included)
dpkg-source --after-build .
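[The SetuptoolsDeprecationWarning emitted during dh_auto_install points at pypa/build and pypa/installer as the replacements for running setup.py directly. For reference, a minimal sketch of the suggested path using pypa/build's Python API; this assumes the 'build' package is available and is illustrative only, since Debian's pybuild drives the build its own way:

    # Build a wheel the standards-based way instead of `setup.py install`.
    from build import ProjectBuilder

    builder = ProjectBuilder(".")            # directory containing setup.py/pyproject.toml
    wheel = builder.build("wheel", "dist/")  # returns the path of the built wheel
    print("built:", wheel)

The resulting wheel would then be installed with pypa/installer (python3 -m installer) rather than copied file-by-file by setup.py.]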
dpkg-buildpackage: info: binary-only upload (no source included)
dpkg-genchanges: info: not including original source code in upload
I: copying local configuration
I: unmounting dev/ptmx filesystem
I: unmounting dev/pts filesystem
I: unmounting dev/shm filesystem
I: unmounting proc filesystem
I: unmounting sys filesystem
I: cleaning the build env
I: removing directory /srv/workspace/pbuilder/1216197 and its subdirectories
I: Current time: Mon Mar 10 23:20:47 -12 2025
I: pbuilder-time-stamp: 1741692047
Tue Mar 11 11:20:49 UTC 2025 I: 1st build successful. Starting 2nd build on remote node ionos15-amd64.debian.net.
Tue Mar 11 11:20:49 UTC 2025 I: Preparing to do remote build '2' on ionos15-amd64.debian.net.
Tue Mar 11 11:33:06 UTC 2025 I: Deleting $TMPDIR on ionos15-amd64.debian.net.
Tue Mar 11 11:33:06 UTC 2025 I: sparql-wrapper-python_2.0.0-2_amd64.changes:
Format: 1.8
Date: Wed, 26 Jun 2024 09:15:38 +0200
Source: sparql-wrapper-python
Binary: python3-sparqlwrapper
Architecture: all
Version: 2.0.0-2
Distribution: unstable
Urgency: medium
Maintainer: Debian Python Team <team+python@tracker.debian.org>
Changed-By: Alexandre Detiste <tchet@debian.org>
Description:
 python3-sparqlwrapper - SPARQL endpoint interface to Python3
Changes:
 sparql-wrapper-python (2.0.0-2) unstable; urgency=medium
 .
   * Team upload.
   * Release to unstable
Checksums-Sha1:
 20955763a0de33973ab0efc870efd6349b5b2c6c 39228 python3-sparqlwrapper_2.0.0-2_all.deb
 c7ef51ca940e4a10e83cd3807a58b1d8c932b3ee 5722 sparql-wrapper-python_2.0.0-2_amd64.buildinfo
Checksums-Sha256:
 7bf208644df2408ffd63ef24cb2f96b190e7f9bfe8b7ba1a1e33189ba85eca17 39228 python3-sparqlwrapper_2.0.0-2_all.deb
 d43ce159614af428fe3bfda404ec157408002c70c68ce560434fd989c87cb62c 5722 sparql-wrapper-python_2.0.0-2_amd64.buildinfo
Files:
 40f1ec5204cb5e56bb71b735ea1bdda1 39228 python optional python3-sparqlwrapper_2.0.0-2_all.deb
 2a696b20a27a03988150aa99d281f7e4 5722 python optional sparql-wrapper-python_2.0.0-2_amd64.buildinfo
Tue Mar 11 11:33:08 UTC 2025 I: diffoscope 289 will be used to compare the two builds:
Running as unit: rb-diffoscope-amd64_27-49028.service
# Profiling output for: /usr/bin/diffoscope --timeout 7200 --html /srv/reproducible-results/rbuild-debian/r-b-build.U3bS0lhH/sparql-wrapper-python_2.0.0-2.diffoscope.html --text /srv/reproducible-results/rbuild-debian/r-b-build.U3bS0lhH/sparql-wrapper-python_2.0.0-2.diffoscope.txt --json /srv/reproducible-results/rbuild-debian/r-b-build.U3bS0lhH/sparql-wrapper-python_2.0.0-2.diffoscope.json --profile=- /srv/reproducible-results/rbuild-debian/r-b-build.U3bS0lhH/b1/sparql-wrapper-python_2.0.0-2_amd64.changes /srv/reproducible-results/rbuild-debian/r-b-build.U3bS0lhH/b2/sparql-wrapper-python_2.0.0-2_amd64.changes
## command (total time: 0.000s)
0.000s 1 call cmp (internal)
## has_same_content_as (total time: 0.000s)
0.000s 1 call diffoscope.comparators.binary.FilesystemFile
## main (total time: 0.004s)
0.004s 2 calls outputs
0.000s 1 call cleanup
Finished with result: success
Main processes terminated with: code=exited/status=0
Service runtime: 263ms
CPU time consumed: 263ms
Tue Mar 11 11:33:08 UTC 2025 I: diffoscope 289 found no differences in the changes files, and a .buildinfo file also exists.
Tue Mar 11 11:33:08 UTC 2025 I: sparql-wrapper-python from unstable built successfully and reproducibly on amd64.
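[diffoscope's "no differences" verdict means the two independent builds produced bit-identical artifacts. The core claim can be spot-checked by hashing the .deb from each build and comparing against the Checksums-Sha256 entry recorded in the .changes file above; a minimal sketch, where the b1/ and b2/ directory names mirror the result paths passed to diffoscope:

    import hashlib
    from pathlib import Path

    # SHA-256 recorded for python3-sparqlwrapper_2.0.0-2_all.deb in the .changes file
    EXPECTED = "7bf208644df2408ffd63ef24cb2f96b190e7f9bfe8b7ba1a1e33189ba85eca17"

    for build_dir in ("b1", "b2"):
        deb = Path(build_dir) / "python3-sparqlwrapper_2.0.0-2_all.deb"
        digest = hashlib.sha256(deb.read_bytes()).hexdigest()
        print(build_dir, "OK" if digest == EXPECTED else "MISMATCH: " + digest)

If both digests match the recorded value, the package is bit-for-bit reproducible for this build environment, which is exactly what the log asserts.]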
Tue Mar 11 11:33:10 UTC 2025 I: Submitting .buildinfo files to external archives:
Tue Mar 11 11:33:10 UTC 2025 I: Submitting 8.0K b1/sparql-wrapper-python_2.0.0-2_amd64.buildinfo.asc
Tue Mar 11 11:33:10 UTC 2025 I: Submitting 8.0K b2/sparql-wrapper-python_2.0.0-2_amd64.buildinfo.asc
Tue Mar 11 11:33:11 UTC 2025 I: Done submitting .buildinfo files to http://buildinfo.debian.net/api/submit.
Tue Mar 11 11:33:11 UTC 2025 I: Done submitting .buildinfo files.
Tue Mar 11 11:33:11 UTC 2025 I: Removing signed sparql-wrapper-python_2.0.0-2_amd64.buildinfo.asc files:
removed './b1/sparql-wrapper-python_2.0.0-2_amd64.buildinfo.asc'
removed './b2/sparql-wrapper-python_2.0.0-2_amd64.buildinfo.asc'